Research & Development
Duration: 90 to 150 min
110 questions
Scientific R&D management

Does your scientific research and development follow the industry's best practices? Perform a full evaluation.


Highlights

More than 110 best practices about Research and Development.

The full R&D scope is covered, from fundamental research and the understanding of customer needs to market launch (or deployment in your plants, for process development), as well as the management of the existing product portfolio.

Two uses are possible:

  • learn R&D product development management best practices
  • evaluate your R&D practices against the best, using the online questionnaire functionality
The evaluation embeds "Lean Development" practices.


Scientific Research and Development best practice model

Scientific Research and Development Evaluation Questionnaire

Brief description

The model includes 110 best practices structured in 7 categories and 23 areas as summarised below.


Strategy and roadmap

  • R&D strategy: what does the strategy contain and how is it defined? What are the strategic objectives and how to achieve them? Is "open innovation" used or considered?
  • Customer and scientific roadmaps: are these two roadmaps defined and how? What are the exchanges between R&D and sales or marketing departments? Are the roadmaps consistent with each other?
  • Standardisation: are scientific methods standardised (characterisation, formulation, solution preparation, etc.)?


R&D steering
  • Financial and strategic management: how are costs, subsidies or taxes monitored? Are the development costs per product tracked? Is the volume of activity by 'strategic category' monitored?
  • Activity management: how are non-financial indicators monitored and do they have clearly defined objectives?


Scientific portfolio
  • Idea generation: how are internal ideas collected and managed until they become Research or Development projects?
  • Intellectual protection: what is the intellectual protection policy and how are patents filed? Is there any monitoring of competing patents in your market?
  • Scientific portfolio: how are all ideas, preliminary projects, feasibility studies, etc. managed and prioritized? Are there any performance indicators? What is the coordination with individual project reviews?
  • Product portfolio: Are existing products monitored over their life cycle and how does this affect the upstream scientific portfolio?


Customer satisfaction
  • Customer needs: how are customer needs identified, tested and characterized?
  • Specification management: how are these requirements described in functional or performance terms and prioritized? How can we ensure that they are respected throughout the development cycle?
  • Change requests: how are possible change requests from the customer (needs) or from within (changes in wording, processes, etc.) handled?



Project management
  • Team: is there a formal team with clear roles and responsibilities? To whom do its members report during development, and at which physical location do they work?
  • Project steering: What are the methodologies for managing Research and Development projects and how are they applied (indicators, monitoring meetings, risk reduction, etc.)?
  • Planning & scheduling: What are the planning formats and what is their real use? Are resource requirements being monitored? Is there a progress indicator?
  • Subcontracting and supply: how are suppliers or contractors managed and by whom? How do purchasing and R&D coordinate?



Organisation and skills
  • Structure: how is R&D organised geographically (in case of several locations) and locally, and are the associated responsibilities clear at each level? Are resources performing similar tasks grouped together? Is there an end-to-end responsibility for the development cycle?
  • Qualification and Leadership: Are the technical, managerial and leadership skills at the right level in a good social climate?
  • Skills and knowledge: how are skills managed and developed, and how is knowledge managed and shared?
  • Relationship network: How are external relations (laboratories, suppliers, etc.) managed and monitored within the framework of a defined policy?



Methodologies and support
  • Scientific process: are the methodologies and procedures for scientific work (characterization, formulation, etc.) defined and applied? Are there any peer reviews?
  • Development process: how are the different stages of research or development defined and managed?
  • Resource allocation: how is resource allocation managed? Do consolidated visions of current needs and use exist?
  • Software and tools: is R&D supported with the right tools and systems?
 

Example of usage for evaluation:

Perform an in-depth evaluation of your Scientific Research and Development:
  • identify the levers to provide your customers with more efficient products, through a less costly and faster Research and Development cycle
  • compare your performance with the world's best R&D practices
  • define a progress plan with actionable levers using our Wevalgo model of organisational levers

Specific analyses when used for evaluation

An assessment against best practices

  • Each question is an assessment of a practice against the best practice identified by our experts
  • The result allows immediate visualisation with a score from 0 to 100, 100 being the best-practice score, regardless of the question format

A normative evaluation

For each best practice, the question is very precise and the answer choices are almost systematically standardised, to facilitate the evaluation and above all to avoid subjectivity.

Some evaluations use a standard derived from CMMI; others use a specific Wevalgo standard. Depending on the practice, the evaluation criteria are, for example:

  • quality and completeness of application
  • clarity of roles and responsibilities
  • ownership by people and effective level of use
  • formalisation and level of detail
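As a minimal illustration of how standardised answer choices can yield a 0 to 100 score regardless of the question format, consider the sketch below. The five maturity levels and the linear scaling are assumptions for the example, not Wevalgo's actual answer wording or scoring formula.

```python
# Illustrative sketch only: normalising the selected answer of any
# standardised question to a score on a 0-100 scale.
# The maturity labels below are hypothetical, CMMI-inspired examples.

MATURITY_CHOICES = [
    "Not applied",
    "Partially applied, informally",
    "Applied, partially formalised",
    "Applied and formalised",
    "Fully applied, formalised and continuously improved",
]

def score(choice_index: int, n_choices: int) -> float:
    """Map the index of the selected answer to a 0-100 score,
    0 for the first choice and 100 for the last."""
    return 100 * choice_index / (n_choices - 1)

# A five-level question and a yes/no question both land on the same scale
five_level = score(3, len(MATURITY_CHOICES))  # "Applied and formalised" -> 75.0
yes_no = score(1, 2)                          # "Yes" -> 100.0
```

Because every format is mapped to the same scale, scores from differently shaped questions can be compared and consolidated directly.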

A synthetic and detailed structured evaluation

  • The evaluation model is structured in a 'pyramidal' way to provide both a synthetic and a detailed analysis
  • There are four levels of consolidation: individual practice (question), sub-category, category, general total
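The four-level pyramidal consolidation can be sketched as a simple rollup of question scores, as below. The plain averaging at each level and the sample scores are assumptions for illustration; the actual Wevalgo model may weight practices differently.

```python
# Illustrative sketch only: rolling up 0-100 question scores through
# four consolidation levels (question -> sub-category -> category ->
# general total), assuming an unweighted average at each level.

def average(values):
    return sum(values) / len(values)

def consolidate(model):
    """model: {category: {sub_category: [question scores on 0-100]}}"""
    sub_scores = {}
    cat_scores = {}
    for category, subs in model.items():
        for sub, questions in subs.items():
            sub_scores[(category, sub)] = average(questions)
        cat_scores[category] = average(
            [sub_scores[(category, s)] for s in subs]
        )
    general_total = average(list(cat_scores.values()))
    return sub_scores, cat_scores, general_total

# Hypothetical scores for two categories of the model
model = {
    "Strategy and roadmap": {"R&D strategy": [80, 60], "Standardisation": [40]},
    "R&D steering": {"Financial and strategic management": [100, 50, 75]},
}
subs, cats, total = consolidate(model)
```

Reading the result top-down gives the synthetic view (the general total), while each lower level explains where a weak score comes from.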


Functional and project dimensions evaluation

The evaluation is performed along two dimensions to better understand your performance:
  • a functional dimension: how the Research and Development functions or departments are managed
  • a project dimension: how the product development 'projects' are managed

The entire innovation lifecycle is covered, from initial idea generation or customer needs collection to market introduction, and then to product portfolio analysis and rationalisation.



"Lean Development" evaluation

Our whole model fully embeds the Lean principles; we have adapted the manufacturing Lean principles to Research, Development or Engineering, as illustrated below.

Although "Lean Development" runs through almost all of our model and best practices, the questions covering the most concrete Lean practices are tagged so that the results report shows how this first level of Lean is implemented.

               

Lean waste and our adaptation to Scientific Research and Development:

  • Defects: misunderstood or unclear requirements, ineffective design reviews, poor usage of scientific methodologies or development stage-gates
  • Overproduction: unnecessary requirements or design features, lack of standards, misalignment between new scientific discoveries and product requirements
  • Transportation: unnecessary movement of people or information; inappropriate scientist or technician locations, project handovers
  • Waiting: waiting for decisions, ineffective planning
  • Inventory: activities or expensive material orders performed too far in advance due to ineffective planning or resource allocation
  • Motion: unnecessary activities to perform work; informal, unstructured, non-integrated or duplicated data, unfit supporting tools
  • Over-processing: unnecessary validations, overly complex designs or tests, too many reports
  • Skills and "brain waste": poor competence management and allocation, lack of training, lack of learning loops

Wevalgo organisation model

The evaluation defines the actionable drivers for the identified improvements, enabling the definition of a concrete action plan:

  • Leadership & People: leadership capabilities, competences, social climate and values
  • Strategy & assets: organisation strategy, technology, tangible and intangible assets
  • Organisation: organisational structure, roles and responsibilities
  • Steering: decision making, actions and performance indicator management
  • Process: process, operating procedures and rules definition
  • Tools: decision-making tools, process support tools, software applications
  • Implementation: effective implementation of strategy, organisation, steering, processes and tools

Key steps

The best practices are accessible directly after purchase. If the service is used to evaluate the practices, the following steps apply:

  1. The service manager* can directly carry out an evaluation by answering the questionnaire (optional)
    • If he/she wishes to invite other participants to conduct the evaluation, he/she proceeds to steps 2 to 5
    • Otherwise, he/she goes directly to step 5 to view the results
    • He/she can still do his/her own evaluation after the participants if he/she did not do it at the beginning
  2. Customisation of the service to the organisation by the service manager*
    • Service name, title and introduction for the participants invited
    • Selection of participants to the service
    • Definition of dimensions to enable analysis by organisational dimension (optional)
    • Selection of anonymity option
  3. Sending of an invitation link to the selected participants
  4. Evaluations performed by selected participants, on the Wevalgo web platform
    • Participants connect to the Wevalgo website via the link sent by the service manager
    • Participants answer online
    • The service manager can follow the progress of the answers
  5. Results available at the end of evaluations on the Wevalgo website

* The service manager is the person who purchased the service. 



Recommended participants for an evaluation

For the evaluation we recommend the following participants: 

  • Research and Development top management team
  • A few development project managers
  • A few scientific area managers
  • A selection of a few people outside of Research, Development: sales and marketing, purchasing, manufacturing manager...
  • In case of several geographical sites, a selection of site managers, enabling comparison of practices between the different sites.


Technical Requirements

In order to use the services, the technical environment must comply with the requirements below. 

Browser requirements
  • Chrome: 62 and above
  • Internet Explorer: 11 and above
  • Safari: 10 and above
  • Firefox: 45 and above
  • Opera: 42 and above
Other
  • desktop or laptop usage, no mobile version
  • javascript enabled
  • Firewall authorising access to the Wevalgo website

If you use browsers or settings other than those listed here, the site's pages may not display properly, and you may encounter problems that Customer Service may not be able to resolve.

These technical requirements apply to all the participants of the service.


Get instant and full access to reports as soon as evaluations are completed.

Gain multi-level analysis along several axes that can be explored at varied depths and dimensions through a user-friendly results menu. A few samples of the results report are shown below.

Heat map
Visualise performance and its drivers at a glance across all categories and sub-categories

Results by organisational dimension
Compare performance across your own, relevant and entirely customisable organisational dimensions, for example by geographical location, department, hierarchical level or any other dimension you see fit. Zoom in to breakdowns by category, sub-category and/or question

Results by evaluator and category
See detailed results to better understand the causes of your organisation's performance

Wevalgo excellence model results
Get valuable insight into areas and axes of progress thanks to our embedded Wevalgo model (other analytical models may be incorporated; see key concepts for more information)



Our expert consultants are on hand to build your own evaluations, assist you and interpret your results. Select the Expert Review option in the purchase menu.


