Yann-Gaël Guéhéneuc
Département de génie informatique et de génie logiciel
This work is licensed under a Creative
Commons Attribution-NonCommercial-
ShareAlike 3.0 Unported License
NSERC
DG Advice
yann-gael.gueheneuc@polymtl.ca
Version 0.5
2013/07/07
2/89
Questions: I welcome them all at
yann-gael.gueheneuc@polymtl.ca
3/89
Disclaimer: I cannot be held responsible for
the failure or success of your applications,
whether or not you follow this advice
4/89
NSERC DG Advice
 Each NSERC DG application is evaluated
according to 4 criteria and 6 merit indicators
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Cost of research
5/89
NSERC DG Advice
 Each NSERC DG application is evaluated
according to 4 criteria and 6 merit indicators
– Exceptional
– Outstanding
– Very strong
– Strong
– Moderate
– Insufficient
6/89
NSERC DG Advice
 How are these criteria rated by the reviewers
using the indicators?
 How to ease the reviewers’ jobs?
And, possibly, how to be more successful?
7/89
Outline
 Process in a nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
 Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
 Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
 Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
 Conclusion
 Further readings
8/89
Outline
 Process in a nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
 Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
 Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
 Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
 Conclusion
 Further readings
9/89
Process in a Nutshell
 From the candidate’s point of view
– August 1st: submission of Form 180
– November 1st: final submission of Forms 100,
101, and publications
– March/April: announcement of the results
10/89
Process in a Nutshell
 From the internal reviewer’s point of view
– Two main parts
• Off-line work, e-mails/readings
• Competition week in Ottawa
11/89
Process in a Nutshell
 Off-line work
– August 27th: reception of all the submissions
• In 2012, 322 submissions
– September 7th: ratings (expertise levels and
conflicts) of all the submissions
• High, Medium, Low, Very Low, Conflict, X (language)
– September 24th: final choice of the 1st internal
reviewer for each application
• In 2012, 14 applications as 1st internal reviewer, 15
as 2nd internal reviewer, 17 as reader = 46
12/89
Process in a Nutshell
 Off-line work
– October 5th: choice by the 1st internal reviewer of
5 external referees
• In 2012, 14 applications = 70
• May include referees suggested by the candidate but
may also replace all of them
– October 22nd: ratings of applications from other
evaluation groups
• In 2012, 1 application
13/89
Process in a Nutshell
 Off-line work
– Early December: final list of readings
• In 2012, 47 applications
– January/February: reception of the reports from
the external referees
• In 2012, 123 reports
– February 18th to 22nd: competition week in
Ottawa during which each application is
discussed and rated
14/89
Process in a Nutshell
 Off-line work
– In 2012 (and I suspect every year), a lot of work!
• 322 submissions
• 47 applications (including joint publications)
• 70 referees
• 123 referee reports
15/89
Process in a Nutshell
 Off-line work
– In 2012 (and I suspect every year), a lot of work!
• 322 submissions
• 47 applications (including joint publications)
• 70 referees
• 123 referee reports
Make it easier
for the reviewers
16/89
Process in a Nutshell
 Competition week
– February 18th to 22nd: competition week in
Ottawa during which each application is
discussed and rated
– 5 days
• In 2012 (and I suspect every year), very intense,
demanding, and tiring
17/89
Process in a Nutshell
 Competition day
– Starts at 8:30am
– Divides into
• 31 15-minute slots
• 2 15-minute breaks
• 1 45-minute lunch
– Ends at 5:15pm
• If no deferred applications to re-discuss
– In 2012, 1 application
18/89
Process in a Nutshell
 Competition slot
– In a 15-minute slot, the ratings of an application
are chosen by the reviewers
– Or the application is “deferred”, to be re-
discussed at the end of the day
19/89
Process in a Nutshell
 Competition slot
– 1st internal reviewer gives ratings with
justifications, which must be facts in the Forms
– 2nd internal reviewers contrasts, supports, adds
missing facts from the Forms
– The readers complement or challenge ratings
given by 1st and 2nd internal reviewers, must be
supported by facts from the Forms
20/89
Process in a Nutshell
 Competition slot
– 1st internal reviewer gives ratings with
justifications, which must be facts in the Forms
• In 2012, a typical presentation followed this pattern
– Candidate: career, funding, visibility, publications, HQP
record, planned training
– Proposal: context, lacks, characteristics (Incremental?
Applicable? Feasible?)
– External: summary of the referees' reviews, summary of the
provided contributions
then, the reviewer would give his or her ratings
21/89
Process in a Nutshell
 Competition slot
– 1st internal reviewer gives ratings with
justifications, which must be facts in the Forms
• In 2012, a typical presentation followed this pattern
– Candidate: career, funding, visibility, publications, HQP
record, planned training
– Proposal: context, lacks, characteristics (Incremental?
Applicable? Feasible?)
– External: summary of the referees' reviews, summary of the
provided contributions
then, the reviewer would give his or her ratings
Not exactly the
NSERC criteria
22/89
Process in a Nutshell
 Competition slot
– The session chair keeps the time strictly
– The session chair makes sure that any
discussion sticks to the facts
23/89
Process in a Nutshell
 Competition slot
– Ratings are anonymous
• Secret electronic vote
• The session chair announces the results
– Ratings are consensual
• If reviewers/readers strongly disagree, the application
will be re-discussed at the end of the day
– In 2012, I did not see any strong debates: mostly the 1st and 2nd
internal reviewers agreed, backed up by the readers
– In 2012, some facts were sometimes highlighted and ratings
were changed accordingly
24/89
Process in a Nutshell
 Competition slot
– Any criterion rated as moderate or insufficient
receives comments from the committee, reflecting
the consensus of the reviewers (highly focused)
• In 2012, NSERC provided typical comments, for
example: “The applicant did not take advantage of the
available space in Form 100 to make a compelling
case about his/her most significant research
contributions. Given the lack of information, the EG
was unable to carry out a thorough assessment and
potentially recommend a higher rating.”
25/89
Outline
 Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
 Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
 Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
 Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
 Conclusion
 Further readings
26/89
Funding Decisions
 In a nutshell
– Each proposal is rated by the reviewers secretly
after the discussions
– The medians of the ratings are used for criteria
– For example
• Excellence of researcher: {S, S, M, M, M}, rating is M
• Merit of the proposal: {V, V, S, S, M}, rating is S
• Impact of HQP: {V, S, S, S, M}, rating is S
• The application rating is therefore {M, S, S}
27/89
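The median rule above can be sketched in a few lines of Python. The numeric scale (Insufficient=1 … Exceptional=6) is my assumption, inferred from the bin arithmetic shown on the next slides; it is not stated by NSERC here.

```python
from statistics import median

# Assumed numeric scale, inferred from the bin arithmetic (e.g. {M, S, S} -> 2+3+3):
# Insufficient=1, Moderate=2, Strong=3, Very strong=4, Outstanding=5, Exceptional=6.
SCALE = {"I": 1, "M": 2, "S": 3, "V": 4, "O": 5, "E": 6}
LETTER = {n: l for l, n in SCALE.items()}

def median_rating(votes):
    """Median of the reviewers' secret votes on one criterion."""
    # With an odd number of reviewers, the median is always a whole scale value.
    return LETTER[median(SCALE[v] for v in votes)]

print(median_rating(["S", "S", "M", "M", "M"]))  # M
print(median_rating(["V", "V", "S", "S", "M"]))  # S
print(median_rating(["V", "S", "S", "S", "M"]))  # S
```

With the five votes of the example, the application rating {M, S, S} falls out directly.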
Funding Decisions
 Bins
– The numeric “values” of the ratings are “added”
• For example, {M, S, S} → 2+3+3 = 8
– The application is placed into one of 16 bins
– The bins are labelled A through to P and
correspond numerically to 18 down to 3
28/89
Funding Decisions
 Bins
– Bins A and P are uniquely mapped to {E, E, E}
and {I, I, I} while other bins contain a mix of
numerically equivalent ratings, e.g., {V, S, M} is
in the same bin as {S, S, S} and {M, S, V}
• For example, the application rated {M, S, S} is in K
– Not all applications in a bin are funded: {S, S, S}
may be funded while {M, S, V} is not
• Because of the moderate indicator for the first criterion
– Cut-off point depends on year
29/89
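The bin arithmetic described above can be sketched in Python. The numeric scale (Insufficient=1 … Exceptional=6) is an assumption consistent with A ↔ 18 and P ↔ 3 for 16 bins:

```python
# Assumed scale: Insufficient=1 ... Exceptional=6, so three criteria sum to 3..18.
SCALE = {"I": 1, "M": 2, "S": 3, "V": 4, "O": 5, "E": 6}

def bin_letter(ratings):
    """Bin (A..P) for an application's three criterion ratings."""
    total = sum(SCALE[r] for r in ratings)  # e.g. {M, S, S} -> 2 + 3 + 3 = 8
    return chr(ord("A") + 18 - total)       # A <-> 18 down to P <-> 3

print(bin_letter(["M", "S", "S"]))  # K
print(bin_letter(["E", "E", "E"]))  # A
print(bin_letter(["I", "I", "I"]))  # P
```

Note that the bin only captures the sum: whether an application inside a bin is funded still depends on the individual indicators, as the {S, S, S} vs. {M, S, V} example shows.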
Funding Decisions
 ER vs. ECR
– Candidates are divided into
• ER: established researchers, who have already applied
to (and possibly been funded by) the NSERC DG program
• ECR: early-career researchers, who apply to the
NSERC DG program for the first time
– ECR are funded one bin “lower” (better) than ER
30/89
Outline
 Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
 Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
 Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
 Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
 Conclusion
 Further readings
31/89
Criteria and Indicators
 “Values” of criteria
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Cost of research
32/89
Criteria and Indicators
 “Values” of criteria
– Excellence of the researcher
• Knowledge, expertise and experience
• Quality of contributions to, and impact on, research
areas in the NSE
• Importance of contributions
33/89
Criteria and Indicators
 “Values” of criteria
– Merit of the proposal
• Originality and innovation
• Significance and expected contributions to research;
potential for technological impact
• Clarity and scope of objectives
• Methodology and feasibility
• Extent to which the scope of the proposal addresses
all relevant issues
• Appropriateness of, and justification for, the budget
• Relationship to other sources of funds
34/89
Criteria and Indicators
 “Values” of criteria
– Merit of the proposal
• Originality and innovation
• Significance and expected contributions to research;
potential for technological impact
• Clarity and scope of objectives
• Methodology and feasibility
• Extent to which the scope of the proposal addresses
all relevant issues
• Appropriateness of, and justification for, the budget
• Relationship to other sources of funds
Not really important
35/89
Criteria and Indicators
 “Values” of criteria
– Merit of the proposal
• Originality and innovation
• Significance and expected contributions to research;
potential for technological impact
• Clarity and scope of objectives
• Methodology and feasibility
• Extent to which the scope of the proposal addresses
all relevant issues
• Appropriateness of, and justification for, the budget
• Relationship to other sources of funds
Not really important
Amounts of previous
grants (in particular
NSERC DG) should
be ignored
36/89
Criteria and Indicators
 “Values” of criteria
– Contribution to the training of HQP
• Quality and impact of contributions
• Appropriateness of the proposal for the training of
HQP in the NSE
• Enhancement of training arising from a collaborative
or interdisciplinary environment, where applicable
37/89
Criteria and Indicators
 “Values” of criteria
– Cost of research
• Rationale
38/89
Criteria and Indicators
 “Values” of criteria
– Cost of research
• Rationale
Not really important,
but you cannot receive
more than what you
ask for, no matter the merit
39/89
Criteria and Indicators
 “Meanings” of indicators
– Exceptional
– Outstanding
– Very strong
– Strong
– Moderate
– Insufficient
40/89
Criteria and Indicators
 “Meanings” of indicators
– Exceptional
• In 2012, I did not see any exceptional ratings
– Outstanding
– Very strong
– Strong
– Moderate
– Insufficient
Criteria and Indicators
42/89
Criteria and Indicators
 NSERC rating form
– NSERC provides a 2-page rating form
• In 2012, I found that this rating form did not follow
the presentation pattern used during the competition
slot because it spreads out the information
• In 2012, however, each application was obviously
rated according to the 4 criteria and 6 indicators
43/89
Criteria and Indicators
 NSERC rating form (1/2)
Applicant: University: Application I.D.:
Applicant Status:
Title of Application:
Evaluation criteria (See Section 6 of Peer Review Manual for complete details)
Excellence of researcher(s)
Exceptional Outstanding Very Strong
Strong Moderate Insufficient
Knowledge, expertise and experience
Quality of contributions to, and impact on, research areas in the NSE
Importance of contributions
For Team applications: complementarity of expertise between members and synergy
Rationale for rating:
Merit of the proposal
Exceptional Outstanding Very Strong
Strong Moderate Insufficient
Originality and innovation
Significance and expected contributions to research; potential for technological impact
Clarity and scope of objectives
Methodology and feasibility
Extent to which the scope of the proposal addresses all relevant issues
Appropriateness of, and justification for, the budget
Relationship to other sources of funds
Rationale for rating:
Contributions to training of highly qualified personnel
Exceptional Outstanding Very Strong
Strong Moderate Insufficient
Quality and impact of contributions during the last six years
Appropriateness of the proposal for the training of HQP in the NSE
Enhancement of training arising from a collaborative or interdisciplinary environment,
where applicable
Rationale for rating:
Cost of research (relative cost of the proposed research program as compared to the norms for the field) Low Normal High
Rationale for Cost of Research:
44/89
Criteria and Indicators
 NSERC rating form (2/2)
Other comments (e.g., duration should exceptionally be less than norm, special circumstances, quality of samples of
contributions provided, environmental impact, ethical concerns. Your Program Officer should be notified accordingly):
Summary of assessment by external referees (please highlight any comments that would be deemed inappropriate for the
Evaluation Group to consider in their discussions):
Points for message to applicant (if rating of “Moderate” or “Insufficient” on any criterion or duration shorter than norm):
Discovery Accelerator Supplement (DAS):
 Regular DAS: Yes No
 DAS in Targeted Areas: Yes No
Rationale for DAS Recommendation:
45/89
Criteria and Indicators
 My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
46/89
Criteria and Indicators
 My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Researcher
47/89
Criteria and Indicators
 My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Researcher
Proposal
48/89
Criteria and Indicators
 My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Researcher
Proposal
HQP
49/89
Outline
 Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
 Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
 Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
 Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
 Conclusion
 Further readings
50/89
Advice
 Introduction
– Reviewers receive two to three dozen applications
– Overall, upon first review, the quality is
impressive, which generates a positive reaction
– The objective, however, is to discriminate,
which initiates a vigorous search for flaws
51/89
Advice
 Introduction
– Reviewers may perceive aspects of applications
as confusing, ambiguous, incomplete, or just not
compelling
– They will not give the benefit of the doubt
• In 2012, I witnessed some excellent researchers
receiving low ratings because of sloppiness in
their applications
52/89
Advice
 Introduction
– Reviewers will most likely “mine” the Forms 100,
101, and publications to make up their minds
regarding the 4 criteria
Make it easy for them to mine your applications!
53/89
Advice
 Introduction
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
54/89
Advice
 Introduction
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Form 100
55/89
Advice
 Introduction
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Form 100
Form 101
56/89
Advice
 Introduction
– Form 100
• Is used for two of the three important criteria
– Form 101
• Is used for the merit of the proposal mostly
57/89
Excellence of the Researcher
 Form 100
– Career: pages 1-2
– Funding: pages 3-…
– Visibility
• “Other Evidence of Impact and Contributions”
• Awards, chairing, editorship, organisation, seminars:
anything showing external acknowledgments
– Publications
• “Other Research Contributions”
• Quantity and quality
58/89
Excellence of the Researcher
 Form 101
– Essentially nothing
 Contributions
– Important for the experts, should be explained
for the non-experts in Form 100, “Most
Significant Contributions to Research”
 External reviewers
– Confirm/contrast findings in the Form 100, 101,
and the publications
59/89
Merit of the Proposal
 Form 101
– Context
• Is the application well positioned?
– Lacks
• Any problems not discussed?
– Incremental?
• How innovative?
– Applicable?
• Usefulness, even remote?
– Feasible?
• Methodology
60/89
Merit of the Proposal
 Form 101
– Reviewers may also look for
• Knowledge of the key issues (background)
• Originality and innovation (background limits)
• Clarity of scope and objectives
• Methodology
– Trust/confidence that you can do the work
• Significance
61/89
Merit of the Proposal
 Form 100
– Essentially nothing
 Contributions
– Essentially nothing
 External reviewers
– Confirm/contrast findings in the Form 101
62/89
Contribution to the Training of HQP
 Form 100
– HQP Record
• Pages 1, 5-…
• Make it consistent, report what the students do now
– Planned training
• “Contributions to the Training of Highly Qualified
Personnel”
63/89
Contribution to the Training of HQP
 Form 101
– Last part on “Contribution to the Training of
Highly Qualified Personnel (HQP)”
 Contributions
– Essentially nothing
 External reviewers
– Confirm/contrast findings in the Form 100, 101
64/89
Forms 100 and 101
 In 2012, in general, my reading/rating was
made easier when the application carefully
followed the NSERC suggested
guidelines/templates
– Any missing/misplaced/different category was
disruptive and sent a bad signal: “I want to do
things differently from others”
65/89
Form 100
 In 2012, here is what I particularly looked at
“TRAINING OF HIGHLY QUALIFIED PERSONNEL”
– The total number of students, to know whether the
candidate is actively supervising students
– Any increase/decrease
– Any inflation of the numbers just before the
submission, which sends a bad message
66/89
Form 100
 In 2012, here is what I particularly looked at
“ACADEMIC, RESEARCH AND INDUSTRIAL EXPERIENCE”
– Current and past positions
– Any industrial experience
– Experiences that could explain an absence of
publications or of HQP
67/89
Form 100
 In 2012, here is what I particularly looked at
“RESEARCH SUPPORT”
– Does the candidate, given his or her
experience, have appropriate funds?
• NSERC and industry are most important
• Others can help but should be explained, possibly in
the contributions (F100) or in the budget (F101)
• Amounts are not so important but give a signal
• They show the candidate’s willingness and relevance
68/89
Form 100
 In 2012, here is what I particularly looked at
“HIGHLY QUALIFIED PERSONNEL (HQP)”
– The candidate’s contribution to the training of
past/current HQP
• Titles of the projects should be meaningful and
focused; unrelated titles send a bad signal
• “Present positions” are important and show that the
candidate follows his/her students
• The degree obtained/pursued by the HQP (Ph.D.?
M.Sc.? B.Sc.? Others?)
69/89
Form 100
 In 2012, here is what I particularly looked at
“MOST SIGNIFICANT CONTRIBUTIONS TO RESEARCH”
– Evidence of scientific contributions
• I asked myself “do I know this candidate?” If
unknown, I searched the publication venues to
assess the quality of the contributions
• References should state where the papers were
published, to ease the reviewer’s task
• The contributions should be related to the topic of the
current NSERC DG to show continuation
70/89
Form 100
 In 2012, here is what I particularly looked at
“ARTICLES IN REFEREED PUBLICATIONS”
– Evidence of scientific contributions
• Quality was first and foremost: I looked at the venues
and assessed their quality
– Acronyms must be explained, publisher names, years…
must be given, systematically
– Bibliographic metric values may be given; Google Scholar
citation counts are accepted; other metrics are discounted
• Quantity was a plus but, without quality, it sent a bad
signal: “I publish without focus”
71/89
Form 100
 In 2012, here is what I particularly looked at
“OTHER EVIDENCE OF IMPACT AND CONTRIBUTIONS”
– Evidence of external acknowledgments
• Anything that could help me confirm my impression
on the merit of the candidate: awards, chairing,
editorship, organisation, seminars
– Unknown venues just to “fill in” that part must be avoided
• A lack thereof sent a bad signal: “I am not
involved in the community” or “The community does
not want me”
72/89
Form 101
 In 2012, here is what I particularly looked at
“TITLE OF THE PROPOSAL”
– The title, which must be relevant and accurate,
for a quick understanding (or not!) of what the
proposal is all about
73/89
Form 101
 In 2012, here is what I particularly looked at
“PROVIDE A MAXIMUM OF 10 KEY WORDS…”
– The keywords, which must be relevant and
accurate, to get a deeper idea of what the
proposal is all about
74/89
Form 101
 In 2012, here is what I particularly looked at
“TOTAL AMOUNT REQUESTED FROM NSERC”
– The total amount requested, which could raise
my curiosity if “too” high (say, more than 70 K)
or “too” low (say, less than 30 K)
• I would then read the budget to understand the
detailed numbers
75/89
Form 101
 In 2012, here is what I particularly looked at
“SUMMARY OF PROPOSAL FOR PUBLIC RELEASE”
– This very important part
• In my domain, would help me understand the
application and form an idea of what to expect
• Outside my domain, would help me understand the
application as a whole
• I would look for clear objectives, expected results and
their significance, and contributions to HQP
• Any typo or grammar error raised a warning!
76/89
Form 101
 In 2012, here is what I particularly looked at
“TOTAL PROPOSED EXPENDITURES”
– The use of the money towards HQP
• “Most” of the money should go to fund HQP; I mean
at least 80%-90%
• But some money should be used for students’ travels
(not only for the candidate’s)
• I would also look for any “surprising” amounts in any
“surprising” category
77/89
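The 80%-90% rule of thumb above can be checked with simple arithmetic. The budget categories and amounts below are invented for illustration; they do not come from an NSERC form:

```python
# Hypothetical budget; categories and amounts are invented for illustration.
budget = {
    "student stipends": 28_000,
    "student travel": 4_000,   # some travel money for the students, not only the candidate
    "candidate travel": 2_000,
    "equipment": 2_000,
}

# Share of the budget going to HQP (stipends + student travel).
hqp_share = (budget["student stipends"] + budget["student travel"]) / sum(budget.values())
print(f"HQP share: {hqp_share:.0%}")  # 89% -> within the suggested 80%-90% range
```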
Form 101
 In 2012, here is what I particularly looked at
“BUDGET JUSTIFICATION”
– The use of the money towards HQP
• It should include a table showing, per year, the use of the money
on students and possibly pictures of uncommon equipment
• Budget is not so important but errors or lack of justification sent
a bad signal
• Also, it is important to be consistent: 5 K is not enough to
equip a whole lab with computers…
• It should also explain any specific institutional rules or
complementary source of funds
• When naming conferences for travel, only name conferences
relevant to the proposal and possibly justify the choice
78/89
Form 101
 In 2012, here is what I particularly looked at
“RELATIONSHIP TO OTHER RESEARCH SUPPORT”
– Possible duplication of funds
• If there is a clear relation between two or more funds,
it should be clear how the money of the
NSERC DG will be used, e.g., for students
79/89
Form 101
 In 2012, here is what I particularly looked at
“PROPOSAL”
– The context, objectives, expected results in the
first few lines or maybe an example to help me
understand what the proposal is all about
– A story
– Any missing/misplaced/different category, which
was disruptive and sent a bad signal: “I want to
do things differently from others”
80/89
Form 101
 In 2012, here is what I particularly looked at
“PROPOSAL”
– An up-to-date background
• A few recent references are valuable
• Some discussion of the references and of their
contributions and limitations is definitely needed
• If possible, some clear, simple, and convincing
running examples illustrating the references’ limits
81/89
Form 101
 In 2012, here is what I particularly looked at
“PROPOSAL”
– The contrast with the background
• Then, the proposal should explain how it will improve
on the limits of the state-of-the-art or state-of-the-
practice and demonstrate its feasibility
• Then, it should also explain how it will go beyond
clear feasibility to show innovation
• It must be limited to 3 main objectives, with
milestones and students
82/89
Form 101
 In 2012, here is what I particularly looked at
“PROPOSAL”
– Clearly stated goal(s), challenge(s), expected
results and their significance, for each objective
• The proposal should not hide weaknesses and
challenges but explain them and the methodology to
lessen their impacts
• It should also explain how to evaluate the success/
failure of a goal and what happens “if…”
83/89
Form 101
 In 2012, here is what I particularly looked at
“PROPOSAL”
– The association of students with the objectives
• Possibly, the proposal should explain the recruiting of
students in the next section, “Contribution to HQP”
– Its possible significance
• On society, not on yourself or your small community
• Use of other, existing funds is “proof” of interest from
external, broader society
84/89
Form 101
 In 2012, here is what I particularly looked at
“CONTRIBUTION TO HQP”
– The context of training and its methodology
• It is a chance to explain how this application will
concretely be used to train students
• It is better if there are already some student names
• If the merit of the candidate is low and the candidate
boasts too many students, it can be negative
85/89
Form 101
 In 2012, here is what I particularly looked at
“REFERENCES”
– Consistency
• References should be consistent and well-written;
again, any typos or grammar errors raised a warning!
86/89
Outline
 Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
 Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
 Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
 Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
 Conclusion
 Further readings
87/89
Conclusion
 Follow the guidelines / templates carefully
 Be simple, clear, straightforward
– Even with the weaknesses
 Do not forget anything but do not show off
either, explain, explain, explain
 Avoid typos / grammar errors at all costs
88/89
Outline
 Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
 Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
 Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
 Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
 Conclusion
 Further readings
89/89
Further Readings
 NSERC Discovery Grants – An Insider’s View by
Larry W. Kostiuk (Co-Chair, Fluids GSC 1512)
 How to Succeed in the New NSERC Discovery
Grant Competition Model by Evangelos Milios, Nur
Zincir-Heywood, and Stavros Konstantinidis
 NSERC Discovery Grant Applications: Hints and
Insights by Jason P. Leboe-McGowan
 Advice on NSERC Discovery and RTI Applications
by Robert Bridson
Chapter One Procrument An Overview.pdfChapter One Procrument An Overview.pdf
Chapter One Procrument An Overview.pdf
 
PA2557_SQM_Lecture1 - Course Introduction.pdf
PA2557_SQM_Lecture1 - Course Introduction.pdfPA2557_SQM_Lecture1 - Course Introduction.pdf
PA2557_SQM_Lecture1 - Course Introduction.pdf
 
The HDR Examination Process at MQ
The HDR Examination Process at MQThe HDR Examination Process at MQ
The HDR Examination Process at MQ
 
PLAN ASSESSMENT ACTIVITIES AND PROCESSES
PLAN ASSESSMENT ACTIVITIES AND PROCESSESPLAN ASSESSMENT ACTIVITIES AND PROCESSES
PLAN ASSESSMENT ACTIVITIES AND PROCESSES
 
Apprenticeship Standards - Team Forum Nov 17
Apprenticeship Standards - Team Forum Nov 17Apprenticeship Standards - Team Forum Nov 17
Apprenticeship Standards - Team Forum Nov 17
 
PEEM 4.pptx
PEEM 4.pptxPEEM 4.pptx
PEEM 4.pptx
 
Pathways to become a Chartered Project Professional
Pathways to become a Chartered Project ProfessionalPathways to become a Chartered Project Professional
Pathways to become a Chartered Project Professional
 
6_how_to_write_an_ier_0.pdf
6_how_to_write_an_ier_0.pdf6_how_to_write_an_ier_0.pdf
6_how_to_write_an_ier_0.pdf
 

Plus de Yann-Gaël Guéhéneuc

Ptidej Architecture, Design, and Implementation in Action v2.1
Ptidej Architecture, Design, and Implementation in Action v2.1Ptidej Architecture, Design, and Implementation in Action v2.1
Ptidej Architecture, Design, and Implementation in Action v2.1Yann-Gaël Guéhéneuc
 
Evolution and Examples of Java Features, from Java 1.7 to Java 22
Evolution and Examples of Java Features, from Java 1.7 to Java 22Evolution and Examples of Java Features, from Java 1.7 to Java 22
Evolution and Examples of Java Features, from Java 1.7 to Java 22Yann-Gaël Guéhéneuc
 
Consequences and Principles of Software Quality v0.3
Consequences and Principles of Software Quality v0.3Consequences and Principles of Software Quality v0.3
Consequences and Principles of Software Quality v0.3Yann-Gaël Guéhéneuc
 
Some Pitfalls with Python and Their Possible Solutions v0.9
Some Pitfalls with Python and Their Possible Solutions v0.9Some Pitfalls with Python and Their Possible Solutions v0.9
Some Pitfalls with Python and Their Possible Solutions v0.9Yann-Gaël Guéhéneuc
 
An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...
An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...
An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...Yann-Gaël Guéhéneuc
 
An Explanation of the Halting Problem and Its Consequences
An Explanation of the Halting Problem and Its ConsequencesAn Explanation of the Halting Problem and Its Consequences
An Explanation of the Halting Problem and Its ConsequencesYann-Gaël Guéhéneuc
 
Informaticien(ne)s célèbres (v1.0.2, 19/02/20)
Informaticien(ne)s célèbres (v1.0.2, 19/02/20)Informaticien(ne)s célèbres (v1.0.2, 19/02/20)
Informaticien(ne)s célèbres (v1.0.2, 19/02/20)Yann-Gaël Guéhéneuc
 
On Java Generics, History, Use, Caveats v1.1
On Java Generics, History, Use, Caveats v1.1On Java Generics, History, Use, Caveats v1.1
On Java Generics, History, Use, Caveats v1.1Yann-Gaël Guéhéneuc
 
On Reflection in OO Programming Languages v1.6
On Reflection in OO Programming Languages v1.6On Reflection in OO Programming Languages v1.6
On Reflection in OO Programming Languages v1.6Yann-Gaël Guéhéneuc
 

Plus de Yann-Gaël Guéhéneuc (20)

Ptidej Architecture, Design, and Implementation in Action v2.1
Ptidej Architecture, Design, and Implementation in Action v2.1Ptidej Architecture, Design, and Implementation in Action v2.1
Ptidej Architecture, Design, and Implementation in Action v2.1
 
Evolution and Examples of Java Features, from Java 1.7 to Java 22
Evolution and Examples of Java Features, from Java 1.7 to Java 22Evolution and Examples of Java Features, from Java 1.7 to Java 22
Evolution and Examples of Java Features, from Java 1.7 to Java 22
 
Consequences and Principles of Software Quality v0.3
Consequences and Principles of Software Quality v0.3Consequences and Principles of Software Quality v0.3
Consequences and Principles of Software Quality v0.3
 
Some Pitfalls with Python and Their Possible Solutions v0.9
Some Pitfalls with Python and Their Possible Solutions v0.9Some Pitfalls with Python and Their Possible Solutions v0.9
Some Pitfalls with Python and Their Possible Solutions v0.9
 
An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...
An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...
An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...
 
An Explanation of the Halting Problem and Its Consequences
An Explanation of the Halting Problem and Its ConsequencesAn Explanation of the Halting Problem and Its Consequences
An Explanation of the Halting Problem and Its Consequences
 
Are CPUs VMs Like Any Others? v1.0
Are CPUs VMs Like Any Others? v1.0Are CPUs VMs Like Any Others? v1.0
Are CPUs VMs Like Any Others? v1.0
 
Informaticien(ne)s célèbres (v1.0.2, 19/02/20)
Informaticien(ne)s célèbres (v1.0.2, 19/02/20)Informaticien(ne)s célèbres (v1.0.2, 19/02/20)
Informaticien(ne)s célèbres (v1.0.2, 19/02/20)
 
Well-known Computer Scientists v1.0.2
Well-known Computer Scientists v1.0.2Well-known Computer Scientists v1.0.2
Well-known Computer Scientists v1.0.2
 
On Java Generics, History, Use, Caveats v1.1
On Java Generics, History, Use, Caveats v1.1On Java Generics, History, Use, Caveats v1.1
On Java Generics, History, Use, Caveats v1.1
 
On Reflection in OO Programming Languages v1.6
On Reflection in OO Programming Languages v1.6On Reflection in OO Programming Languages v1.6
On Reflection in OO Programming Languages v1.6
 
ICSOC'21
ICSOC'21ICSOC'21
ICSOC'21
 
Vissoft21.ppt
Vissoft21.pptVissoft21.ppt
Vissoft21.ppt
 
Service computation20.ppt
Service computation20.pptService computation20.ppt
Service computation20.ppt
 
Serp4 iot20.ppt
Serp4 iot20.pptSerp4 iot20.ppt
Serp4 iot20.ppt
 
Msr20.ppt
Msr20.pptMsr20.ppt
Msr20.ppt
 
Iwesep19.ppt
Iwesep19.pptIwesep19.ppt
Iwesep19.ppt
 
Icsoc20.ppt
Icsoc20.pptIcsoc20.ppt
Icsoc20.ppt
 
Icsoc18.ppt
Icsoc18.pptIcsoc18.ppt
Icsoc18.ppt
 
Icsm20.ppt
Icsm20.pptIcsm20.ppt
Icsm20.ppt
 

Dernier

Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...MyIntelliSource, Inc.
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...MyIntelliSource, Inc.
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...kellynguyen01
 
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...soniya singh
 
XpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsXpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsMehedi Hasan Shohan
 
ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...
ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...
ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...Christina Lin
 
5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdf5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdfWave PLM
 
Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...aditisharan08
 
Salesforce Certified Field Service Consultant
Salesforce Certified Field Service ConsultantSalesforce Certified Field Service Consultant
Salesforce Certified Field Service ConsultantAxelRicardoTrocheRiq
 
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptxKnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptxTier1 app
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - InfographicHr365.us smith
 
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfThe Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfkalichargn70th171
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideChristina Lin
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackVICTOR MAESTRE RAMIREZ
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyFrank van der Linden
 
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...stazi3110
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptkotipi9215
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software DevelopersVinodh Ram
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfkalichargn70th171
 

Dernier (20)

Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
 
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...Call Girls In Mukherjee Nagar 📱  9999965857  🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
Call Girls In Mukherjee Nagar 📱 9999965857 🤩 Delhi 🫦 HOT AND SEXY VVIP 🍎 SE...
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
 
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
 
XpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsXpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software Solutions
 
ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...
ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...
ODSC - Batch to Stream workshop - integration of Apache Spark, Cassandra, Pos...
 
5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdf5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdf
 
Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...Unit 1.1 Excite Part 1, class 9, cbse...
Unit 1.1 Excite Part 1, class 9, cbse...
 
Salesforce Certified Field Service Consultant
Salesforce Certified Field Service ConsultantSalesforce Certified Field Service Consultant
Salesforce Certified Field Service Consultant
 
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptxKnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - Infographic
 
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfThe Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStack
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The Ugly
 
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.ppt
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software Developers
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
 

Advice for writing a NSERC Discovery grant application v0.5

  • 1. Yann-Gaël Guéhéneuc Département de génie informatique et de génie logiciel This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License NSERC DG Advices yann-gael.gueheneuc@polymtl.ca Version 0.5 2013/07/07
  • 2. 2/89 Questions: I welcome them all at yann-gael.gueheneuc@polymtl.ca
  • 3. 3/89 Disclaimer: I cannot be held responsible for the failure or success of your applications, whether or not you follow these advices
  • 4. 4/89 NSERC DG Advices  Each NSERC DG application is evaluated according to 4 criteria and 6 merit indicators – Excellence of the researcher – Merit of the proposal – Contribution to the training of HQP – Cost of research
  • 5. 5/89 NSERC DG Advices  Each NSERC DG application is evaluated according to 4 criteria and 6 merit indicators – Exceptional – Outstanding – Very strong – Strong – Moderate – Insufficient
  • 6. 6/89 NSERC DG Advices  How are these criteria rated by the reviewers using the indicators?  How to ease the reviewers’ jobs? And, how to be possibly more successful?
  • 7. 7/89 Outline  Process in a nutshell – Candidate’s point of view – Internal reviewer’s point of view • Off-line work • Competition week  Funding decisions – In a nutshell – Bins – ER vs. ECR  Criteria and indicators – “Values” of criteria – “Meanings” of indicators – NSERC rating form – My own rating form  Advices – Introduction – Excellence of the researcher – Merit of the proposal – Contribution to the training HQP – Form 100 – Form 101  Conclusion  Further readings
  • 8. 8/89 Outline  Process in a nutshell – Candidate’s point of view – Internal reviewer’s point of view • Off-line work • Competition week  Funding decisions – In a nutshell – Bins – ER vs. ECR  Criteria and indicators – “Values” of criteria – “Meanings” of indicators – NSERC rating form – My own rating form  Advices – Introduction – Excellence of the researcher – Merit of the proposal – Contribution to the training HQP – Form 100 – Form 101  Conclusion  Further readings
  • 9. 9/89 Process in a Nutshell  From the candidate’s point of view – August 1st: submission of Form 180 – November 1st: final submission of Forms 100, 101, and publications – March/April: announcement of the results
  • 10. 10/89 Process in a Nutshell  From the internal reviewer’s point of view – Two main parts • Off-line work, e-mails/readings • Competition week in Ottawa
  • 11. 11/89 Process in a Nutshell  Off-line work – August 27th: reception of all the submissions • In 2012, 322 submissions – September 7th: ratings (expertise levels and conflicts) of all the submissions • High, Medium, Low, Very Low, Conflict, X (language) – September 24th: final choice of the 1st internal reviewers for each application • In 2012, 14 applications as 1st internal reviewer, 15 as 2nd internal reviewer, 17 as reader = 46
  • 12. 12/89 Process in a Nutshell  Off-line work – October 5th: choice by the 1st internal reviewer of 5 external referees • In 2012, 14 applications = 70 • May include referees suggested by the candidate but may also replace all of them – October 22nd: ratings of applications from other evaluation groups • In 2012, 1 application
  • 13. 13/89 Process in a Nutshell  Off-line work – Early December: final list of readings • In 2012, 47 applications – January/February: reception of the reports from the external referees • In 2012, 123 reports – February 18th to 22nd: competition week in Ottawa during which each application is discussed and rated
  • 14. 14/89 Process in a Nutshell  Off-line work – In 2012 (and I suspect every year), a lot of work! • 322 submissions • 47 applications (including joint publications) • 70 referees • 123 referee reports
  • 15. 15/89 Process in a Nutshell  Off-line work – In 2012 (and I suspect every year), a lot of work! • 322 submissions • 47 applications (including joint publications) • 70 referees • 123 referee reports Make it easier for the reviewers
  • 16. 16/89 Process in a Nutshell  Competition week – February 18th to 22nd: competition week in Ottawa during which each application is discussed and rated – 5 days • In 2012 (and I suspect every year), very intense, demanding, and tiring
  • 17. 17/89 Process in a Nutshell  Competition day – Starts at 8:30am – Divides into • 31 15-minute slots • 2 15-minute breaks • 1 45-minute lunch – Ends at 5:15pm • If no deferred applications to re-discuss – In 2012, 1 application
  • 18. 18/89 Process in a Nutshell  Competition slot – In a 15-minute slot, the ratings of an application are chosen by the reviewers – Or the application is “deferred”, to be re-discussed at the end of the day
  • 19. 19/89 Process in a Nutshell  Competition slot – 1st internal reviewer gives ratings with justifications, which must be facts in the Forms – 2nd internal reviewers contrasts, supports, adds missing facts from the Forms – The readers complement or challenge ratings given by 1st and 2nd internal reviewers, must be supported by facts from the Forms
  • 20. 20/89 Process in a Nutshell  Competition slot – 1st internal reviewer gives ratings with justifications, which must be facts in the Forms • In 2012, a typical presentation followed this pattern – Candidate: career, funding, visibility, publications, HQP record, planned training – Proposal: context, lacks, characteristics (Incremental? Applicable? Feasible?) – External: summary of the referees' reviews, summary of the provided contributions; then, the reviewer would give his ratings
  • 21. 21/89 Process in a Nutshell  Competition slot – 1st internal reviewer gives ratings with justifications, which must be facts in the Forms • In 2012, a typical presentation followed this pattern – Candidate: career, funding, visibility, publications, HQP record, planned training – Proposal: context, lacks, characteristics (Incremental? Applicable? Feasible?) – External: summary of the referees' reviews, summary of the provided contributions; then, the reviewer would give his ratings Not exactly the NSERC criteria
  • 22. 22/89 Process in a Nutshell  Competition slot – Session chair keeps the time strictly – Session chair makes sure that any discussion sticks to the facts
  • 23. 23/89 Process in a Nutshell  Competition slot – Ratings are anonymous • Secret electronic vote • Session chair announces the results – Ratings are consensual • If reviewers/readers strongly disagree, the application will be re-discussed at the end of the day – In 2012, I did not see any strong debates: mostly 1st and 2nd internal reviewers agreed, backed up by the readers – In 2012, some facts were sometimes highlighted and ratings were changed accordingly
  • 24. 24/89 Process in a Nutshell  Competition slot – Any criterion rated as moderate or insufficient receives comments from the committee, reflecting the consensus of the reviewers (highly focused) • In 2012, NSERC provided typical comments, for example: “The applicant did not take advantage of the available space in Form 100 to make a compelling case about his/her most significant research contributions. Given the lack of information, the EG was unable to carry out a thorough assessment and potentially recommend a higher rating.”
  • 25. 25/89 Outline  Process in a Nutshell – Candidate’s point of view – Internal reviewer’s point of view • Off-line work • Competition week  Funding decisions – In a nutshell – Bins – ER vs. ECR  Criteria and indicators – “Values” of criteria – “Meanings” of indicators – NSERC rating form – My own rating form  Advices – Introduction – Excellence of the researcher – Merit of the proposal – Contribution to the training HQP – Form 100 – Form 101  Conclusion  Further readings
  • 26. 26/89 Funding Decisions  In a nutshell – Each proposal is rated by the reviewers secretly after the discussions – The medians of the ratings are used for criteria – For example • Excellence of researcher: {S, S, M, M, M}, rating is M • Merit of the proposal: {V, V, S, S, M}, rating is S • Impact of HQP: {V, S, S, S, M}, rating is S • The application rating is therefore {M, S, S}
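The per-criterion median described above can be sketched in a few lines of Python. This is a minimal illustration, assuming the numeric scale implied by the slides (Insufficient = 1 up to Exceptional = 6); the single-letter codes are the deck's own abbreviations.

```python
from statistics import median

# Ordered scale for the six merit indicators (assumption: I=1 ... E=6).
SCALE = {"I": 1, "M": 2, "S": 3, "V": 4, "O": 5, "E": 6}
LETTER = {value: letter for letter, value in SCALE.items()}

def criterion_rating(votes):
    """Median of the reviewers' secret votes for one criterion."""
    return LETTER[int(median(SCALE[v] for v in votes))]

# The three criteria from the example above, each voted on by five reviewers.
assert criterion_rating(["S", "S", "M", "M", "M"]) == "M"  # Excellence of researcher
assert criterion_rating(["V", "V", "S", "S", "M"]) == "S"  # Merit of the proposal
assert criterion_rating(["V", "S", "S", "S", "M"]) == "S"  # Impact of HQP
```

With five votes the median is always one of the cast votes, so no rounding ambiguity arises.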
  • 27. 27/89 Funding Decisions  Bins – The numeric “values” of the ratings are “added” • For example, {M, S, S}  2+3+3 = 8 – The application is placed into one of 16 bins – The bins are labelled A through to P and correspond numerically to 18 down to 3
  • 28. 28/89 Funding Decisions  Bins – Bins A and P are uniquely mapped to {E, E, E} and {I, I, I} while other bins contain a mix of numerically equivalent ratings, e.g., {V, S, M} is in the same bin as {S, S, S} and {M, S, V} • For example, the application rated {M, S, S} is in K – Not all applications in a bin are funded: {S, S, S} may be funded while {M, S, V} is not • Because of the moderate indicator for the first criterion – Cut-off point depends on year
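The rating-to-bin arithmetic above can be sketched as a short Python snippet. It assumes the same numeric scale (I = 1 up to E = 6) and the stated mapping of totals 18 down to 3 onto bins A through P.

```python
# Numeric "values" of the six indicators (assumption: I=1, M=2, S=3, V=4, O=5, E=6).
VALUE = {"I": 1, "M": 2, "S": 3, "V": 4, "O": 5, "E": 6}

def bin_label(ratings):
    """Sum the three criterion ratings and map the total to a bin letter.

    Bins run from A (total 18, i.e., {E, E, E}) down to P (total 3, {I, I, I}).
    """
    total = sum(VALUE[r] for r in ratings)  # e.g., {M, S, S} -> 2 + 3 + 3 = 8
    return chr(ord("A") + 18 - total)       # 18 -> A, 17 -> B, ..., 3 -> P

assert bin_label(["E", "E", "E"]) == "A"
assert bin_label(["M", "S", "S"]) == "K"  # the example application above
assert bin_label(["I", "I", "I"]) == "P"
```

Note that the sum deliberately loses information: {V, S, M} and {S, S, S} land in the same bin, which is exactly why the funding cut within a bin can still depend on the individual ratings.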
  • 29. 29/89 Funding Decisions  ER vs. ECR – Candidates are divided into • ER: established researchers, who have already applied (funded or not) to the NSERC DG • ECR: early-career researchers, who apply to the NSERC DG for the first time – ECR are funded one bin “lower” (better) than ER
  • 30. 30/89 Outline  Process in a Nutshell – Candidate’s point of view – Internal reviewer’s point of view • Off-line work • Competition week  Funding decisions – In a nutshell – Bins – ER vs. ECR  Criteria and indicators – “Values” of criteria – “Meanings” of indicators – NSERC rating form – My own rating form  Advices – Introduction – Excellence of the researcher – Merit of the proposal – Contribution to the training HQP – Form 100 – Form 101  Conclusion  Further readings
  • 31. 31/89 Criteria and Indicators  “Values” of criteria – Excellence of the researcher – Merit of the proposal – Contribution to the training of HQP – Cost of research
  • 32. 32/89 Criteria and Indicators  “Values” of criteria – Excellence of the researcher • Knowledge, expertise and experience • Quality of contributions to, and impact on, research areas in the NSE • Importance of contributions
  • 33. 33/89 Criteria and Indicators  “Values” of criteria – Merit of the proposal • Originality and innovation • Significance and expected contributions to research; potential for technological impact • Clarity and scope of objectives • Methodology and feasibility • Extent to which the scope of the proposal addresses all relevant issues • Appropriateness of, and justification for, the budget • Relationship to other sources of funds
  • 34. 34/89 Criteria and Indicators  “Values” of criteria – Merit of the proposal • Originality and innovation • Significance and expected contributions to research; potential for technological impact • Clarity and scope of objectives • Methodology and feasibility • Extent to which the scope of the proposal addresses all relevant issues • Appropriateness of, and justification for, the budget • Relationship to other sources of funds Not really important
  • 35. 35/89 Criteria and Indicators  “Values” of criteria – Merit of the proposal • Originality and innovation • Significance and expected contributions to research; potential for technological impact • Clarity and scope of objectives • Methodology and feasibility • Extent to which the scope of the proposal addresses all relevant issues • Appropriateness of, and justification for, the budget • Relationship to other sources of funds Not really important Amounts of previous grants (in particular NSERC DG) should be ignored
  • 36. 36/89 Criteria and Indicators  “Values” of criteria – Contribution to the training of HQP • Quality and impact of contributions • Appropriateness of the proposal for the training of HQP in the NSE • Enhancement of training arising from a collaborative or interdisciplinary environment, where applicable
  • 37. 37/89 Criteria and Indicators  “Values” of criteria – Cost of research • Rationale
  • 38. 38/89 Criteria and Indicators  “Values” of criteria – Cost of research • Rationale Not really important, but you cannot receive more than you ask for, no matter the merit
  • 39. 39/89 Criteria and Indicators  “Meanings” of indicators – Exceptional – Outstanding – Very strong – Strong – Moderate – Insufficient
  • 40. 40/89 Criteria and Indicators  “Meanings” of indicators – Exceptional • In 2012, I did not see any exceptional ratings – Outstanding – Very strong – Strong – Moderate – Insufficient
  • 42. 42/89 Criteria and Indicators  NSERC rating form – NSERC provides a 2-page rating form • In 2012, I found that this rating form does not follow the presentation pattern during the competition slot because it spreads information • In 2012, however, each application was obviously rated according to the 4 criteria and 6 indicators
  • 43. 43/89 Criteria and Indicators  NSERC rating form (1/2) Applicant: University: Application I.D.: Applicant Status: Title of Application: Evaluation criteria (See Section 6 of Peer Review Manual for complete details) Excellence of researcher(s) Exceptional Outstanding Very Strong Strong Moderate Insufficient Knowledge, expertise and experience Quality of contributions to, and impact on, research areas in the NSE Importance of contributions For Team applications: complementarity of expertise between members and synergy Rationale for rating: Merit of the proposal Exceptional Outstanding Very Strong Strong Moderate Insufficient Originality and innovation Significance and expected contributions to research; potential for technological impact Clarity and scope of objectives Methodology and feasibility Extent to which the scope of the proposal addresses all relevant issues Appropriateness of, and justification for, the budget Relationship to other sources of funds Rationale for rating: Contributions to training of highly qualified personnel Exceptional Outstanding Very Strong Strong Moderate Insufficient Quality and impact of contributions during the last six years Appropriateness of the proposal for the training of HQP in the NSE Enhancement of training arising from a collaborative or interdisciplinary environment, where applicable Rationale for rating: Cost of research (relative cost of the proposed research program as compared to the norms for the field) Low Normal High Rationale for Cost of Research:
• 44. 44/89 Criteria and Indicators  NSERC rating form (2/2)
– Other comments (e.g., duration should exceptionally be less than norm, special circumstances, quality of samples of contributions provided, environmental impact, ethical concerns. Your Program Officer should be notified accordingly):
– Summary of assessment by external referees (please highlight any comments that would be deemed inappropriate for the Evaluation Group to consider in their discussions):
– Points for message to applicant (if rating of “Moderate” or “Insufficient” on any criterion or duration shorter than norm):
– Discovery Accelerator Supplement (DAS):
• Regular DAS: Yes | No
• DAS in Targeted Areas: Yes | No
• Rationale for DAS Recommendation:
  • 45. 45/89 Criteria and Indicators  My own rating form Career 1: <Indicator> Funding Visibility Publications HQP Record 3: <Indicator> Planned training Context 2: <Indicator> Lacks Incremental? Applicable? Feasible? Summary of the referees' reviews Provided contributions
  • 46. 46/89 Criteria and Indicators  My own rating form Career 1: <Indicator> Funding Visibility Publications HQP Record 3: <Indicator> Planned training Context 2: <Indicator> Lacks Incremental? Applicable? Feasible? Summary of the referees' reviews Provided contributions Researcher
  • 47. 47/89 Criteria and Indicators  My own rating form Career 1: <Indicator> Funding Visibility Publications HQP Record 3: <Indicator> Planned training Context 2: <Indicator> Lacks Incremental? Applicable? Feasible? Summary of the referees' reviews Provided contributions Researcher Proposal
  • 48. 48/89 Criteria and Indicators  My own rating form Career 1: <Indicator> Funding Visibility Publications HQP Record 3: <Indicator> Planned training Context 2: <Indicator> Lacks Incremental? Applicable? Feasible? Summary of the referees' reviews Provided contributions Researcher Proposal HQP
  • 49. 49/89 Outline  Process in a Nutshell – Candidate’s point of view – Internal reviewer’s point of view • Off-line work • Competition week  Funding decisions – In a nutshell – Bins – ER vs. ECR  Criteria and indicators – “Values” of criteria – “Meanings” of indicators – NSERC rating form – My own rating form  Advices – Introduction – Excellence of the researcher – Merit of the proposal – Contribution to the training HQP – Form 100 – Form 101  Conclusion  Further readings
  • 50. 50/89 Advices  Introduction – Reviewers receive 2-3 dozen applications – Overall, upon first review, the quality is impressive, thus generating a positive reaction – The objective, however, is to discriminate, initiating a vigorous search for flaws
  • 51. 51/89 Advices  Introduction – Reviewers may perceive aspects of applications as confusing, ambiguous, incomplete, or just not compelling – They will not give the benefit of the doubt • In 2012, I witnessed some excellent researchers receiving low ratings because of sloppiness in their applications
  • 52. 52/89 Advices  Introduction – Reviewers will most likely “mine” the Forms 100, 101, and publications to make up their minds regarding the 4 criteria Make it easy for them to mine your applications!
  • 53. 53/89 Advices  Introduction Career 1: <Indicator> Funding Visibility Publications HQP Record 3: <Indicator> Planned training Context 2: <Indicator> Lacks Incremental? Applicable? Feasible? Summary of the referees' reviews Provided contributions
  • 54. 54/89 Advices  Introduction Career 1: <Indicator> Funding Visibility Publications HQP Record 3: <Indicator> Planned training Context 2: <Indicator> Lacks Incremental? Applicable? Feasible? Summary of the referees' reviews Provided contributions Form 100
  • 55. 55/89 Advices  Introduction Career 1: <Indicator> Funding Visibility Publications HQP Record 3: <Indicator> Planned training Context 2: <Indicator> Lacks Incremental? Applicable? Feasible? Summary of the referees' reviews Provided contributions Form 100 Form 101
  • 56. 56/89 Advices  Introduction – Form 100 • Is used for two of the three important criteria – Form 101 • Is used mostly for the merit of the proposal
  • 57. 57/89 Excellence of the Researcher  Form 100 – Career: pages 1-2 – Funding: pages 3-… – Visibility • “Other Evidence of Impact and Contributions” • Awards, chairing, editorship, organisation, seminars: anything showing external acknowledgments – Publications • “Other Research Contributions” • Quantity and quality
  • 58. 58/89 Excellence of the Researcher  Form 101 – Essentially nothing  Contributions – Important for the experts, should be explained for the non-experts in Form 100, “Most Significant Contributions to Research”  External reviewers – Confirm/contrast findings in Forms 100 and 101 and the publications
  • 59. 59/89 Merit of the Proposal  Form 101 – Context • Is the application well positioned? – Lacks • Any problems not discussed? – Incremental? • How innovative? – Applicable? • Usefulness, even remote? – Feasible? • Methodology
  • 60. 60/89 Merit of the Proposal  Form 101 – Reviewers may also look for • Knowledge of the key issues (background) • Originality and innovation (background limits) • Clarity of scope and objectives • Methodology – Trust/confidence that you can do work • Significance
  • 61. 61/89 Merit of the Proposal  Form 100 – Essentially nothing  Contributions – Essentially nothing  External reviewers – Confirm/contrast findings in the Form 101
  • 62. 62/89 Contribution to the Training of HQP  Form 100 – HQP Record • Pages 1, 5-… • Make it consistent, report what the students do now – Planned training • “Contributions to the Training of Highly Qualified Personnel”
  • 63. 63/89 Contribution to the Training of HQP  Form 101 – Last part on “Contribution to the Training of Highly Qualified Personnel (HQP)”  Contributions – Essentially nothing  External reviewers – Confirm/contrast findings in the Form 100, 101
  • 64. 64/89 Forms 100 and 101  In 2012, in general, my reading/rating was made easier when the application carefully followed the NSERC suggested guidelines/templates – Any missing/misplaced/different category was disruptive and sent a bad signal: “I want to do differently from others”
  • 65. 65/89 Form 100  In 2012, here is what I particularly looked at “TRAINING OF HIGHLY QUALIFIED PERSONNEL” – Total number of students, to know whether the candidate is actively supervising students – Any increase/decrease – Any inflation of the numbers just before the submission, which sends a bad message
  • 66. 66/89 Form 100  In 2012, here is what I particularly looked at “ACADEMIC, RESEARCH AND INDUSTRIAL EXPERIENCE” – Current and past positions – Any industrial experience – Experience that could explain the absence of publications or of HQP
  • 67. 67/89 Form 100  In 2012, here is what I particularly looked at “RESEARCH SUPPORT” – Does the candidate, given his/her experience, have appropriate funds? • NSERC and industry are most important • Others can help but should be explained, possibly in the contributions (F100) or in the budget (F101) • Amounts are not so important but give a signal • They show the candidate’s willingness and relevance
  • 68. 68/89 Form 100  In 2012, here is what I particularly looked at “HIGHLY QUALIFIED PERSONNEL (HQP)” – The candidate’s contribution to the training of past/current HQP • Titles of the projects should be meaningful and focused; unrelated titles send a bad signal • “Present positions” are important and show that the candidate follows his/her students • The degree obtained/pursued by the HQP (Ph.D.? M.Sc.? B.Sc.? Others?)
  • 69. 69/89 Form 100  In 2012, here is what I particularly looked at “MOST SIGNIFICANT CONTRIBUTIONS TO RESEARCH” – Evidence of scientific contributions • I asked myself “do I know this candidate?” If unknown, I searched the places of publication to assess the quality of the contributions • References should include where the papers were published to ease the reviewer’s task • The contributions should be related to the topic of the current NSERC DG to show continuation
  • 70. 70/89 Form 100  In 2012, here is what I particularly looked at “ARTICLES IN REFEREED PUBLICATIONS” – Evidence of scientific contributions • Quality was first and foremost: I looked at the venues and assessed their quality – Acronyms must be explained; publisher names, years… must be given, systematically – Bibliographic metric values may be given; Google Scholar citation counts are accepted; other metrics are discounted • Quantity was a plus but, without quality, it sent a bad signal: “I publish without focus”
  • 71. 71/89 Form 100  In 2012, here is what I particularly looked at “OTHER EVIDENCE OF IMPACT AND CONTRIBUTIONS” – Evidence of external acknowledgments • Anything that could help me confirm my impression of the merit of the candidate: awards, chairing, editorship, organisation, seminars – Unknown venues mentioned just to “fill in” that part must be avoided • Lack thereof sent a bad signal: “I am not involved in the community” or “The community does not want me”
  • 72. 72/89 Form 101  In 2012, here is what I particularly looked at “TITLE OF THE PROPOSAL” – The title, which must be relevant and accurate, for a quick understanding (or not!) of what the proposal is all about
  • 73. 73/89 Form 101  In 2012, here is what I particularly looked at “PROVIDE A MAXIMUM OF 10 KEY WORDS…” – The keywords, which must be relevant and accurate, to get a deeper idea of what the proposal is all about
  • 74. 74/89 Form 101  In 2012, here is what I particularly looked at “TOTAL AMOUNT REQUESTED FROM NSERC” – The total amount requested, which could raise my curiosity if “too” high (say, more than 70 K) or “too” low (say, less than 30 K) • I would then read the budget to understand the detailed numbers
  • 75. 75/89 Form 101  In 2012, here is what I particularly looked at “SUMMARY OF PROPOSAL FOR PUBLIC RELEASE” – This very important part • In my domain, would help me understand the application and form an idea of what to expect • Outside my domain, would help me understand the application as a whole • I would look for clear objectives, expected results and their significance, and contributions to HQP • Any typo or grammar error raised a warning!
  • 76. 76/89 Form 101  In 2012, here is what I particularly looked at “TOTAL PROPOSED EXPENDITURES” – The use of the money towards HQP • “Most” of the money should go to fund HQP; I mean at least 80%-90% • But some money should be used for students’ travel (not only the candidate’s) • I would also look for any “surprising” amounts in any “surprising” category
  • 77. 77/89 Form 101  In 2012, here is what I particularly looked at “BUDGET JUSTIFICATION” – The use of the money towards HQP • It should include a table showing, per year, the use of the money on students and possibly pictures of uncommon equipment • Budget is not so important but errors or lack of justification sent a bad signal • Also, it is important to be consistent: 5 K is not enough to equip a whole lab with computers… • It should also explain any specific institutional rules or complementary sources of funds • When naming conferences for travel, only name conferences relevant to the proposal and possibly justify the choice
  • 78. 78/89 Form 101  In 2012, here is what I particularly looked at “RELATIONSHIP TO OTHER RESEARCH SUPPORT” – Possible duplication of funds • If there is a clear relation between two or more funds, it should be clear how the money of the NSERC DG will be used, e.g., on students
  • 79. 79/89 Form 101  In 2012, here is what I particularly looked at “PROPOSAL” – The context, objectives, expected results in the first few lines, or maybe an example to help me understand what the proposal is all about – A story – Any missing/misplaced/different category, which was disruptive and sent a bad signal: “I want to do differently from others”
  • 80. 80/89 Form 101  In 2012, here is what I particularly looked at “PROPOSAL” – An up-to-date background • A few recent references are interesting • Some discussion of the references and of their contributions and limitations is definitely needed • If possible, some clear, simple, and convincing running examples illustrating the references’ limits
  • 81. 81/89 Form 101  In 2012, here is what I particularly looked at “PROPOSAL” – The contrast with the background • Then, the proposal should explain how it will improve on the limits of the state-of-the-art or state-of-the-practice and demonstrate its feasibility • Then, it should also explain how it will go beyond clear feasibility to show innovation • It must be limited to 3 main objectives, with milestones and students
  • 82. 82/89 Form 101  In 2012, here is what I particularly looked at “PROPOSAL” – Clearly stated goal(s), challenge(s), expected results and their significance, for each objective • The proposal should not hide weaknesses and challenges but explain them and the methodology to lessen their impacts • It should also explain how to evaluate the success/failure of a goal and what happens “if…”
  • 83. 83/89 Form 101  In 2012, here is what I particularly looked at “PROPOSAL” – The association of students with the objectives • Possibly, the proposal should explain the recruiting of students in the next section, “Contribution to HQP” – Its possible significance • On society, not on yourself or your small community • Use of other, existing funds is “proof” of interest from external, broader society
  • 84. 84/89 Form 101  In 2012, here is what I particularly looked at “CONTRIBUTION TO HQP” – The context of training and its methodology • It is a chance to explain how this application will concretely be used to train students • It is better if there are already some student names • If the merit of the candidate is low and the candidate boasts too many students, it can be negative
  • 85. 85/89 Form 101  In 2012, here is what I particularly looked at “REFERENCES” – Consistency • References should be consistent and well-written; again, any typo or grammar error raised a warning!
  • 86. 86/89 Outline  Process in a Nutshell – Candidate’s point of view – Internal reviewer’s point of view • Off-line work • Competition week  Funding decisions – In a nutshell – Bins – ER vs. ECR  Criteria and indicators – “Values” of criteria – “Meanings” of indicators – NSERC rating form – My own rating form  Advices – Introduction – Excellence of the researcher – Merit of the proposal – Contribution to the training HQP – Form 100 – Form 101  Conclusion  Further readings
  • 87. 87/89 Conclusion  Follow the guidelines / templates carefully  Be simple, clear, straightforward – Even with the weaknesses  Do not forget anything but do not show off either: explain, explain, explain  Avoid at all costs typos / grammar errors
  • 88. 88/89 Outline  Process in a Nutshell – Candidate’s point of view – Internal reviewer’s point of view • Off-line work • Competition week  Funding decisions – In a nutshell – Bins – ER vs. ECR  Criteria and indicators – “Values” of criteria – “Meanings” of indicators – NSERC rating form – My own rating form  Advices – Introduction – Excellence of the researcher – Merit of the proposal – Contribution to the training HQP – Form 100 – Form 101  Conclusion  Further readings
  • 89. 89/89 Further Readings  NSERC Discovery Grants – An Insiders View by Larry W. Kostiuk (Co-Chair, Fluids GSC 1512)  How to Succeed in the New NSERC Discovery Grant Competition Model by Evangelos Milios, Nur Zincir-Heywood, and Stavros Konstantinidis  NSERC Discovery Grant Applications: Hints and Insights by Jason P. Leboe-McGowan  Advice on NSERC Discovery and RTI Applications by Robert Bridson