Research, Development, Engineering
Length: 45 to 90 min
50 questions
R&D Project Management

Are your R&D or Engineering projects managed with world class practices?

Discover R&D or Engineering best practices, focused on project management 

45 to 49 best practices in the following areas

  • Customer value: needs, requirements and change management
  • Performance monitoring
  • Supporting processes and tools: development process, resource allocation, software and tools
  • Project activity management, broken down further into:
    • Planning
    • Project team
    • Project monitoring
    • Tools and templates
    • Subcontracting and supply

Some of these areas may go beyond pure project management and instead belong to the organisation's overall 'central' Research and Development processes; since this varies by organisation, we include them by default.

Perform an in-depth evaluation of your R&D or Engineering function

The best practices are presented in a structured questionnaire that lets you evaluate your practices against our benchmark

This service is tailored for

  • Functional scope: project management activities and supporting R&D processes for a Research, Development or Engineering center
  • Companies: all companies with Research, Development or Engineering functions larger than 20 people
  • Industries: all industries with product or processed product development or project Engineering (the assessment adapts itself to specific industries with conditional questions)

Who for

  • Research and Development manager, Engineering manager, manager of the project management competency center
  • Internal consultants (e.g. continuous improvement team) or external consultants
  • Get detailed knowledge of R&D project management best practices
  • Get a detailed quantitative and qualitative assessment of your R&D compared to world best practices
  • Compare your results with best practices
  • Carry out the evaluation yourself and/or invite participants to do it
  • Perform an assessment in 45 to 90 minutes through 48 specific questions
  • Get an overview and a detailed analysis with a pyramid consolidation (question, subcategory, category, total)
  • Compare the results of each participant
  • Compare performance by zone or level of your organisation (country, site, function, hierarchical level)
  • Choose whether participants are anonymous
  • Languages available to each participant: English
  • Choose and validate the evaluation options

Detailed description
How to use


45 to 49 best practices structured into 4 categories and 13 domains, providing both a high-level and a detailed view of each project management domain.
The service allows two uses:
  • learn good R&D or engineering practices
  • evaluate your R&D practices against the best, through the online questionnaire
The evaluation embeds Lean development / Lean engineering.
The number of questions / best practices depends on the type of industry, selected through a choice at the beginning of the questionnaire.

Project management best practices model in R&D and Engineering

There are 45 to 49 project management practices organised in the following structured model, with 4 categories and 13 domains
project management InFocus model for Research, Development or Engineering
Customer value
  • Needs: how are needs captured and formalised?
  • Requirements management: how are requirements used to define specifications, in what form, and how are they monitored?
  • Change management: how are change requests formalized, analyzed, validated...?

Performance monitoring
  • Operational monitoring: which operational indicators are monitored?
  • Financial monitoring: which financial indicators are monitored? Is there a project Business Case?

Supporting processes and tools 
  • Development stages: Does the project follow a standard step-by-step development process?
  • Resourcing: how are project team members assigned to the project?
  • Software and tools: which tools support the project, such as a project monitoring tool or technical data management tools?

Project activity management
  • Planning: does the project use an operational plan with a breakdown into project tasks?
  • Project team: how is the project and its members organized?
  • Project monitoring: what mechanisms (meetings, risk monitoring, etc.) are used to monitor project activities and make decisions?
  • Tools and templates: Are there standard and adapted documentary templates for the project?
  • Subcontracting and procurement: how are the interfaces with subcontracting or suppliers managed?


Example of usage for evaluation

In-depth assessment of your Research, Development or Engineering function, focused on project management, to enable you and your team to:

  • precisely understand project management strengths and weaknesses through quantified data allowing internal and even external comparisons
  • define new project management ways of working, training, or other improvement actions
  • define targeted areas to monitor closely on project management
  • easily compare improvements and progress at different points in time

Specific analyses when used for evaluation

An assessment against best practices

  • Each question is an assessment of a practice against the best practice identified by our experts
  • The result allows immediate visualization with a score from 0 to 100 (100 being the best-practice score), regardless of the question format

    • A normative evaluation

      For each best practice, the question is very precise, and the answer choices are almost systematically standardized, to facilitate the evaluation and above all to avoid subjectivity

      Some evaluations use a standard derived from the CMMI, others a specific Wevalgo standard. Depending on the practice, the evaluation criteria used are, for example:

      • quality and completeness of application
      • clarity of roles and responsibilities
      • ownership by people and effective level of use
      • formalisation and level of detail

    • A synthetic and detailed structured evaluation

      • The evaluation model is structured in a 'pyramidal' way to provide both a synthetic and a detailed analysis
      • There are four levels of consolidation: individual practice (question), sub-category, category, general total
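The pyramid consolidation can be sketched as follows. The answer scale, category names and simple averaging rule below are illustrative assumptions for this sketch, not Wevalgo's actual scoring grid:

```python
# Hypothetical sketch of the scoring described above: each standardized
# answer choice maps to a 0-100 practice score, and scores are then
# consolidated over four "pyramid" levels (question -> sub-category ->
# category -> general total). Scale and averaging rule are assumptions.
from statistics import mean

# Assumed CMMI-like standardized answer scale (100 = best practice).
SCALE = {"not applied": 0, "partial": 25, "defined": 50,
         "managed": 75, "best practice": 100}

# Each answer: (category, sub-category, standardized choice).
answers = [
    ("Customer value", "Needs", "managed"),
    ("Customer value", "Needs", "defined"),
    ("Customer value", "Change management", "partial"),
    ("Performance monitoring", "Operational monitoring", "best practice"),
]

def consolidate(answers):
    """Average question scores upward through the pyramid levels."""
    subcats, cats, all_scores = {}, {}, []
    for cat, sub, choice in answers:
        score = SCALE[choice]                      # question level (0-100)
        subcats.setdefault((cat, sub), []).append(score)
        cats.setdefault(cat, []).append(score)
        all_scores.append(score)
    return {"sub_categories": {k: mean(v) for k, v in subcats.items()},
            "categories": {k: mean(v) for k, v in cats.items()},
            "total": mean(all_scores)}

r = consolidate(answers)
print(r["sub_categories"][("Customer value", "Needs")])  # 62.5
print(r["categories"]["Customer value"])                 # 50
print(r["total"])                                        # 62.5
```

Averaging raw question scores at every level (rather than averaging sub-category averages) is just one possible design choice; a weighted scheme would work equally well here.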


"Lean Development" evaluation

Our whole model fully embeds the Lean principles; we have adapted the manufacturing Lean principles to Research, Development or Engineering, as illustrated below.

Even if "Lean Development" is almost everywhere in our model, the simplest and most practical practices are tagged so that the results report displays how this first level of Lean is implemented.


Lean wastes and our adaptation to Research, Development or Engineering:

  • Defects: misunderstood or unclear requirements, ineffective design reviews, poor usage of technical processes or development stage-gates
  • Over-production: unnecessary requirements or design features, lack of standards, misalignment between new technologies and product or project requirements
  • Transportation: unnecessary movement of people or information (inappropriate engineer locations, project handovers)
  • Waiting: waiting for decisions, ineffective planning
  • Inventory: activities or expensive material orders performed too far ahead due to ineffective planning or resource allocation
  • Motion: unnecessary activities to perform work (informal, unstructured, non-integrated or duplicated data; unfit supporting tools)
  • Over-processing: unnecessary validations, complex designs or tests, too many reports
  • Skills & "brain waste": poor competence management and allocation, lack of training, lack of learning loops


Wevalgo organisation model

The evaluation defines the actionable drivers for the identified improvements, enabling concrete action plan definition:

  • Leadership & People: leadership capabilities, competences, social climate and values
  • Strategy & assets: organisation strategy, technology, tangible and intangible assets
  • Organisation: organisational structure, roles and responsibilities
  • Steering: decision making, actions and performance indicator management
  • Process: process, operating procedures and rules definition
  • Tools: decision-making tools, process support tools, software applications
  • Implementation: effective implementation of strategy, organisation, steering, processes and tools

Key steps

The best practices are accessible directly after purchase. If the service is used to evaluate your practices, the following steps apply:

  1. The service manager* can directly carry out an evaluation by answering the questionnaire (optional); steps 2 to 4 are needed only if he/she invites other participants to carry out the evaluation
    • If he/she wishes to invite other participants to conduct the evaluation, he/she proceeds with steps 2 to 5
    • Otherwise, he/she goes directly to step 5 to view the results
    • He/she can still do his/her own evaluation after the participants if he/she did not do it at the beginning
  2. Customisation of the service to the organisation by the service manager*
    • Service name, title and introduction for the participants invited
    • Selection of participants to the service
    • Definition of dimensions to enable analysis by organisational dimension (optional)
    • Selection of anonymity option
  3. Sending of an invitation link to the selected participants
  4. Evaluations performed by selected participants, on the Wevalgo web platform
    • Participants connect to the Wevalgo website via the link sent by the service manager
    • Participants answer online
    • The service manager can follow the progress of the answers
  5. Results available at the end of evaluations on the Wevalgo website

* The service manager is the person who purchased the service. 

Recommended participants for an evaluation

For the evaluation we recommend the following participants: 

  • Research, Development or Engineering manager
  • A few project managers and the head of project management (if existing)
  • A few technical area managers
  • A selection of a few people outside Research, Development or Engineering: sales and marketing, purchasing, manufacturing managers...
  • In case of several geographical sites, a selection of site top managers and project managers, enabling comparison of practices between the different sites.

Get instant and full access to reports as soon as evaluations are completed.

Gain multi-level analysis on several axes that can be explored in varying depths and dimensions through a user-friendly results menu. A few samples of the results report are shown below.

Heat map
Visualise performance and its drivers at a glance across all categories and sub-categories
Results by organisational dimension
Compare performance across your own, relevant and entirely customisable organisational dimensions, for example geographical locations, departments, hierarchical levels or any other you see fit. Zoom in to breakdowns by category, subcategory and/or question.
Results by evaluator and category
See detailed results by evaluator and category to better understand the causes of your organisation's performance
Wevalgo excellence model Results
Get valuable insight into areas and axes of progress thanks to our embedded Wevalgo model. (Other analytical models may be incorporated; see key concepts for more information.)

We have expert consultants on hand to build custom evaluations, assist you and interpret your results. Select the Expert Review option in the purchase menu.
