Research and Development strengths, weaknesses and team alignment



Price: Free


Research and Development in discrete product industries
Assess the priorities and team alignment on R&D strengths and weaknesses
A well-balanced snapshot combining a quantified assessment of practices with open questions
Scope: the entire Research and Development scope
Industries: discrete product manufacturing industries

Assess the priorities and team alignment on R&D strengths and weaknesses - Discrete products


  • Number of best practice evaluations: 44
  • Time to complete: 30-60 minutes
  • Multi-level analysis
  • Optional anonymity
  • Consolidated results
Specific features
  • Perception and best practice snapshot
Languages available

French, English

An insightful evaluation of the Research, Development or Engineering function for discrete manufacturing industries. The evaluation provides a well-balanced snapshot of team perceptions alongside practice maturity; it helps identify the areas to focus on as well as the team's alignment on Research and Development management priorities.

  • Research and Development Strategy, standardisation policy and innovation portfolio
  • Customer Management including requirement management
  • Supplier Management
  • Key Research and Development processes: Change Management, Technical Development, Project Management
  • Research and Development improvement priorities and plan 

As with all Wevalgo tools, multi-level analysis is possible on various axes: by individual, by organisation dimension (geographical location, department, hierarchical level or any other dimension depending on your organisation), or by evaluator.

The results are provided in a concise report with multiple graphs and a question-by-question analysis, allowing you to quickly identify areas of focus and perform internal comparisons.
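As an illustration of how such a breakdown can work, the short sketch below shows scored answers averaged per question along a chosen axis (for example by site or by evaluator). It is only a hedged sketch with hypothetical field names and data, not Wevalgo's actual implementation.

    # Hypothetical sketch: averaging scored answers along an organisational dimension.
    # Field names ("evaluator", "site", "question", "score") are illustrative only.
    from collections import defaultdict
    from statistics import mean

    responses = [
        {"evaluator": "A", "site": "Lyon",   "question": "Q1", "score": 60},
        {"evaluator": "B", "site": "Lyon",   "question": "Q1", "score": 80},
        {"evaluator": "C", "site": "Boston", "question": "Q1", "score": 40},
    ]

    def average_by(dimension, answers):
        """Average score per question for each value of the chosen dimension."""
        groups = defaultdict(list)
        for a in answers:
            groups[(a[dimension], a["question"])].append(a["score"])
        return {key: mean(scores) for key, scores in groups.items()}

    print(average_by("site", responses))       # {('Lyon', 'Q1'): 70, ('Boston', 'Q1'): 40}
    print(average_by("evaluator", responses))  # one average per individual evaluator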

 


Scope and industries

This service is tailored for:

  • any Research, Development or Engineering function in discrete manufacturing industries (mechanical, high-tech, electronics, consumer or industrial goods…) with products developed internally and sold into one or several customer markets
  • getting a balanced view combining perception with objective evaluation, taking into account both the 'hard' and 'people' aspects.

 

Who is it for?

R&D or Engineering managers, internal consultants (e.g. a continuous improvement team) or external consultants

CEOs, COOs or other senior company managers wishing to evaluate perception and practices across their R&D or Engineering teams


Highlights

44 questions structured across 4 categories and 16 areas (c. 2-3 questions per area) covering the Research and Development management of discrete product development. They help identify strengths and improvement opportunities based on high-level best practices while capturing perception across the organisation. Features include:

  • A mix of normative questions, with precise and auditable criteria, and opinion-seeking questions, providing a degree of objective evaluation balanced with subjective perception.
  • Ability to create your own organisation dimensions (functional, geographical, hierarchical, or even a particular organisational entity), enabling comparison of evaluations:
  • of the same system (the practices, tools, organisation set-up) used by or applied to different people
  • of different systems used in the various organisational areas or units of the evaluated organisation (e.g. different geographical sites)
  • Evaluations can be kept anonymous


How long does it take?

To answer all areas of the evaluation in a meaningful way, we recommend allowing between 30 and 60 minutes. The questionnaire can be saved and answered over several sessions.

Result report

The result report is accessible once the evaluations are completed and contains graphical and text results. There are two options defining the level of analysis detail displayed in the report.
Summary analysis: 
  • Answers by question with average performance for scored questions
  • Comments by evaluator
Detailed analysis:
  • Detailed answers by participant
  • Answers by question with average performance for scored questions
  • Answers by question and by participant with performance by participant for scored questions
  • Comments by evaluator


Number of participants

This is the maximum number of participants who can be invited to answer the questionnaire for a specific analysis or survey.


Downloadable results

When chosen, this option enables you to:
  • download the full result file in Excel
  • download chart data in Excel or print chart images
  • download result comments or specific requested data


Wevalgo Insight model for Research, Development or Engineering in discrete products


 

A model built on practical experience of best practices

As with all Wevalgo evaluation models, this model is built on extensive management consulting experience with international Research, Development or Engineering (RDE) functions.

A structured and insightful Research and Development Evaluation Model 

The model is structured into 4 categories and 16 areas covering the full discrete product development cycle; each area includes 2-3 assessments in a mix of normative and opinion-based formats. It provides a quick yet informed and constructive snapshot of the function. It is inspired by and founded on best practices, so the evaluation provides a meaningful diagnostic from which further areas of focus can easily be determined, or against which the progress of implemented changes can be tracked.

 

The areas of diagnosis have been carefully designed to shine a light on the lifecycle of innovation and discrete product development, from initial idea generation or customer needs collection to market introduction.

Normative questions

Normative questions are used to compare a practice against world best practices; they are very precise, and the answer choices are almost systematically normed to facilitate the evaluation and, above all, to avoid subjectivity.

For example, some evaluations use a norm derived from CMMI; for others we have defined specific norms.

Depending on the practice, the evaluation criteria used include, for example:

  • formalisation and level of detail,
  • quality and scope coverage,
  • people ownership and effective usage level,
  • clarity of roles and responsibilities.

The number of normative questions is more limited than in our Comprehensive Dive assessments, since the duration of this assessment is intentionally limited.

 

Opinion based questions

Some questions are open, with free-text answers, while others ask for evaluations based on the participants' opinions (without being 'normative').
This is entirely intentional, since one of the objectives of the Insight Evaluation is to identify the perception of the key people involved. This makes it possible to identify the participants' alignment and their own priorities. In turn, this facilitates their buy-in to any subsequent action plan based on the results of this assessment.

Get instant and full access to reports as soon as evaluations are completed.

Gain multi-level analysis on several axes that can be explored at varying depths and dimensions through a user-friendly results menu. A few samples of the results report are shown below.

Results by category
These results give a top-level overview of the evaluated practices' maturity.
Results by organisational dimension
Compare performance across your own, relevant and entirely customisable organisational dimensions, for example by geographical location, department, hierarchical level or any other dimension you see fit. Zoom in to breakdowns by category, sub-category and/or question.
Results by evaluator and category
You can see detailed results to better understand the causes of your organisation's performance.


We have expert consultants on hand to help you build your own evaluations, assist you and interpret your results. Select the Expert Review option in the purchase menu.

Key steps

After purchase and activation of the service, the key steps are the following:

  • Customisation of the service to the organisation by the service manager*
    • Service name, title and introduction for the participants invited
    • Selection of participants to the service
    • Definition of dimension to enable analysis by organisation dimension (optional)
    • Selection of anonymity option
  • Sending of an invitation link to the selected participants
  • Evaluations performed by the selected participants on the Wevalgo web platform
    • Participants connect to the Wevalgo website via the link sent by the service manager
    • Participants answer online
    • The service manager can follow the progress of the answers

* The service manager is the person who purchased the service. 



Recommended participants

For this evaluation we recommend the following participants: 

  • The Research, Development or Engineering function manager, or several managers if responsibilities are split
  • A few project managers or technical area managers
  • In the case of several geographical sites, a selection of managers from each site, enabling comparison of practices between the different sites


Technical Requirements

In order to use the services, the technical environment must comply with the requirements below. 

Browser requirements
  • Chrome: 62 and above
  • Internet Explorer: 11 and above
  • Safari: 10 and above
  • Firefox: 45 and above
  • Opera: 42 and above
Other
  • Desktop or laptop usage, no mobile version
  • JavaScript enabled
  • Firewall authorising access to the Wevalgo website

If you use browsers or settings other than those listed here, the site's pages may not display properly, and you may encounter problems that Customer Service may not be able to resolve.

These technical requirements apply to all the participants of the service.