Research, Development, Engineering assessment focused on Project Management



Price: Free


Research, Development, Engineering
Project management
Are you world class at managing your Research, Development or Engineering projects?
Focused and in-depth evaluation based on best practices
Scope: Research, Development, Engineering focused on Project management
Industries: All industries with Research, Development, Engineering activities
Specific analysis:
  • Lean development drivers
  • Research and Development actionable drivers


  • Number of best practice evaluations: 48
  • Time to complete: 45 to 90 minutes
  • Multi-level analysis
  • Optional anonymity
  • Consolidated results
Specific features
  • Research and Development actionable drivers
  • Lean Development
Languages available

English

Focused assessment of the Research, Development or Engineering function, covering the activities related to project management. The assessment is based on Research and Development best practices, looking in depth at:


  • Customer Management
  • Research and Development Operations: development process, resource allocation, software and tools
  • Project activity Management, broken down further into:
    • Planning
    • Project Team
    • Project steering
    • Tools and templates

Some of these areas may go beyond pure project management and be part of the overall 'central' Research and Development process; since this depends on your own organisation, we include these areas.

The assessment provides multi-level analysis along several axes: by individual or consolidated area of best practice, by organisational dimension (geographical location, department, hierarchical level or any other dimension depending on your organisation), by evaluator, by action driver (Wevalgo model) and by Lean development driver.

The results are provided in a concise report (or a detailed one, if that option is selected) with multiple graphs, allowing you to clearly identify areas of improvement as well as perform internal or even external* benchmarks.

 


Scope and industries

This service is tailored for:
  • all Research, Development or Engineering functions, across all industries (the assessment adapts itself to specific industries with conditional questions)
  • with a focus on project-related activities


Who For

Research and Development managers, Engineering managers, managers of the project management competency center, internal consultants (e.g. a continuous improvement team) or external consultants

Highlights

48 best practice evaluations of Research, Development or Engineering project-related activities, structured in 3 categories and 11 areas, enabling you to identify strengths and improvement opportunities in both a detailed and a consolidated view

  • Identification of strengths and improvement opportunities according to the Wevalgo organisational model, enabling you to define improvement actions along consistent drivers (strategy and assets, organisation and governance, leadership and people, process and rules, effective implementation)
  • Identification of Lean Development drivers
  • Normative questions with precise and auditable criteria (verifiable against the evaluated organisation's data), enabling objective evaluation
  • Ability to create your own organisational dimensions (functional, geographical, hierarchical, or even a particular organisational entity), enabling you to compare evaluations of:
    • the same system (the practices, tools, organisational set-up) used by or applied to different people
    • different systems used in the various organisational areas or units of the evaluated organisation (e.g. different geographical sites)
  • Evaluations can be kept anonymous


How long does it take?

Since it is a very detailed assessment, it takes around 45 to 90 minutes to answer all the questions in a meaningful way. The questionnaire can be saved and completed over several sessions.

Result report

The result report is accessible once the evaluations are completed and contains graphical and text results. Two options define the level of analysis detail displayed in the report (an illustrative sketch of the underlying aggregations follows the lists below).
Summary analysis: 
  • Answers by question with average performance for scored questions
  • Overall result scoring displayed in a chart
  • Analysis of the differences in results between the top-level questionnaire categories
  • Comments by evaluator
Detailed analysis:
  • Detailed answers by participant
  • Answers by question with average performance for scored questions
  • Answers by question and by participant with performance by participant for scored questions
  • Heatmap of the results, with category and subcategory performance levels
  • Performance levels by category with breakdown by subcategory
  • Performance levels by category and evaluator
  • Performance levels by category and organisational dimension (custom)
  • Performance levels by Wevalgo performance model
  • Performance levels by custom driver
  • Comments by evaluator
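
As a purely illustrative sketch, the aggregations listed above amount to grouping and averaging individual scored answers. The column names and sample data below are hypothetical and do not reflect Wevalgo's actual data format.

```python
# Illustrative sketch only (not Wevalgo's implementation): reproducing the
# main report aggregations from a hypothetical export of scored answers.
import pandas as pd

answers = pd.DataFrame([
    {"category": "Project activity Management", "subcategory": "Planning",
     "question": "Q01", "evaluator": "Evaluator A", "score": 3},
    {"category": "Project activity Management", "subcategory": "Planning",
     "question": "Q01", "evaluator": "Evaluator B", "score": 4},
    {"category": "Customer Management", "subcategory": "Customer needs",
     "question": "Q02", "evaluator": "Evaluator A", "score": 2},
])

# Summary analysis: average performance by question
per_question = answers.groupby("question")["score"].mean()

# Detailed analysis: heatmap data, i.e. average performance by category and subcategory
heatmap = answers.pivot_table(index="category", columns="subcategory",
                              values="score", aggfunc="mean")

# Performance levels by category and evaluator
by_evaluator = answers.groupby(["category", "evaluator"])["score"].mean()

print(per_question, heatmap, by_evaluator, sep="\n\n")
```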


Number of participants

This is the maximum number of participants who can be invited to answer the questionnaire for a specific analysis or survey.


Downloadable results

When chosen, this option enables you to:
  • download the full result file in Excel
  • download chart data in Excel or print chart images
  • download result comments or other specific requested data


Wevalgo project management InFocus model for Research, Development or Engineering


 

A model built on practical experience of Project management best practices in Research, Development & Engineering

This model is built on extensive management consulting experience with international Research, Development or Engineering (RDE) functions; each of its best practices has been observed in real conditions.

48 best practices in a structured model

The model is structured in three categories and eleven areas of project management; each area includes 4 to 6 detailed best practices. This structure enables you to identify strengths and improvement opportunities at different levels, from a high-level, holistic view down to a detailed view (see the sketch after the list below). Best practices are tested across the whole RDE scope along two main dimensions:

  • a functional dimension: how the RDE function and its departments are managed,
  • a project dimension: how the product development projects are managed.
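
As a purely illustrative sketch of this three-level structure (categories, areas, best practices), the representation below uses area names partly taken from this page and partly placeholders; it is not the actual model content.

```python
# Illustrative sketch of the model's three-level structure
# (categories -> areas -> best practices). Names are placeholders, not the
# actual Wevalgo model content.
from dataclasses import dataclass, field

@dataclass
class Area:
    name: str
    practices: list[str] = field(default_factory=list)  # 4 to 6 best practices each

@dataclass
class Category:
    name: str
    areas: list[Area] = field(default_factory=list)

model = [
    Category("Customer Management",
             [Area("Customer needs", ["hypothetical practice 1", "hypothetical practice 2"])]),
    Category("Research and Development Operations",
             [Area("Development process"), Area("Resource allocation"), Area("Software and tools")]),
    Category("Project activity Management",
             [Area("Planning"), Area("Project Team"), Area("Project steering"), Area("Tools and templates")]),
]

# In the full model, this sums to 48 practices across the 11 areas
total_practices = sum(len(area.practices) for category in model for area in category.areas)
```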

The entire lifecycle of innovation is covered, from initial idea generation or customer needs collection to the market introduction and then the product portfolio analysis and rationalisation.

Normative questions

For each best practice, the question is very precise and the answer choices are almost systematically normed, to facilitate the evaluation and above all to avoid subjectivity.

For example, some evaluations use a norm derived from CMMI; for others we have defined specific norms.

Depending on the practice, the evaluation criteria include, for example:

  • formalisation and detailed level,
  • quality and scope coverage,
  • people ownership and effective usage level,
  • roles and responsibilities clarity.

 

 

"Lean Development" Evaluation

 

Our whole model fully embeds Lean principles; we have adapted the manufacturing Lean principles to Research, Development or Engineering, as illustrated below.

Even though Lean Development is present almost everywhere in our model, the simplest and most practical practices are tagged so that the result report displays how this first level of Lean is implemented.

 

Lean waste and our adaptation to Research, Development or Engineering:

  • Defects: misunderstood or unclear requirements, ineffective design reviews, poor usage of technical processes or development stage-gates
  • Overproduction: unnecessary requirements or design features, lack of standards, misalignment between new technologies and product or project requirements
  • Transportation: unnecessary movement of people or information, such as inappropriate location of engineers or project handovers
  • Waiting: waiting for decisions, ineffective planning
  • Inventory: activities or expensive material orders performed too far ahead due to ineffective planning or resource allocation
  • Motion: unnecessary activities to perform work, such as informal, unstructured, non-integrated or duplicated data, or unfit supporting tools
  • Over-processing: unnecessary validations, overly complex designs or tests, too many reports
  • Skills and "brain waste": poor competence management and allocation, lack of training, lack of learning loops

 

Wevalgo Research and Development actionable drivers model 

Wevalgo organisation model 
In the detailed results report, each practice is linked to one of the dimensions of the Research and Development organisation model, so that the improvement drivers are also defined along these dimensions.
This provides even clearer and more actionable drivers, facilitating the definition of the improvement action plan (see the sketch after the list below).
The dimensions are:

  • Leadership & People: leadership capabilities, competences, social climate and value,
  • Strategy & assets: organisation strategy, technology, tangible and intangible assets,
  • Organisation: organisational structure, roles and responsibilities,
  • Steering: decision making, actions and performance indicators management,
  • Process: process, operating procedures and rules definition,
  • Tools: decision making tools, process support tools, software applications,
  • Implementation: effective implementation of strategy, organisation, steering, processes and tools.
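
As a hedged illustration of this link (the practice identifiers, the mapping and the scores below are hypothetical), rolling practice scores up by dimension is what turns individual evaluations into actionable drivers:

```python
# Illustrative sketch only: rolling hypothetical practice scores up into the
# organisation-model dimensions to highlight the most actionable drivers.
from collections import defaultdict

PRACTICE_DIMENSION = {          # hypothetical practice -> dimension mapping
    "P01": "Leadership & People",
    "P02": "Strategy & assets",
    "P03": "Process",
    "P04": "Tools",
}

practice_scores = {"P01": 3.5, "P02": 2.0, "P03": 4.0, "P04": 2.5}  # hypothetical scores

dimension_scores = defaultdict(list)
for practice, score in practice_scores.items():
    dimension_scores[PRACTICE_DIMENSION[practice]].append(score)

# Average score per dimension: the lowest averages point to the dimensions
# where improvement actions are most needed
for dimension, scores in sorted(dimension_scores.items()):
    print(f"{dimension}: {sum(scores) / len(scores):.1f}")
```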
 

Get instant and full access to reports as soon as evaluations are completed.

Gain multi-level analysis along several axes that can be explored at varied depths and dimensions through a user-friendly results menu. A few samples of the results report are shown below.

Heat map
Visualise performance and its drivers at a glance across all categories and sub-categories
Results by organisational dimension
Compare performance across your own, relevant and entirely customisable organisational dimensions, for example by geographical location, department, hierarchical level or any other you see fit. Zoom in to breakdowns by category, subcategory and/or question
Results by evaluator and category
See detailed results to better understand the causes of your organisation's performance
Wevalgo excellence model Results
Get valuable insight into areas and axes of progress thanks to our embedded Wevalgo model (other analytical models may be incorporated; see key concepts for more information).

To see a full example of a result report,

Our expert consultants are on hand to help you build your own evaluations, assist you and interpret your results. Select the Expert Review option on the purchase menu.

Key steps

After purchase and activation of the service, the key steps are the following:

  • Customisation of the service to the organisation by the service manager*
    • Service name, title and introduction for the participants invited
    • Selection of participants to the service
    • Definition of dimensions to enable analysis by organisational dimension (optional)
    • Selection of anonymity option
  • Sending of an invitation link to the selected participants
  • Evaluations performed by the selected participants on the Wevalgo web platform
    • Participants connect to the Wevalgo website via the link sent by the service manager
    • Participants answer online
    • The service manager can follow the progress of the answers

* The service manager is the person who purchased the service. 



Recommended participants

For this evaluation we recommend the following participants: 

  • The Research, Development or Engineering function manager, or several managers if responsibilities are split
  • A few project managers or technical area managers
  • In the case of several geographical sites, a selection of managers from each site, enabling comparison of practices between the different sites


Technical Requirements

In order to use the services, the technical environment must comply with the requirements below. 

Browser requirements
  • Chrome: 62 and above
  • Internet Explorer: 11 and above
  • Safari: 10 and above
  • Firefox: 45 and above
  • Opera: 42 and above
Other
  • desktop or laptop usage, no mobile version
  • JavaScript enabled
  • Firewall authorising Wevalgo website 

If you use browsers or settings other than those listed here, the site's pages may not display properly, and you may encounter problems that Customer Service may not be able to resolve.

These technical requirements apply to all the participants of the service.