Collaborative Expedition Workshop #79, The Science of Science Policy (40XU)
Remote Session - Session One (40XS)
Workspace: Let's use the chat room at: http://webconf.soaphub.org/conf/room/colab_2008_12_17/ (where the conversation has already begun, in a sense.) (40YE)
Participants: (40YA)
Pick one of the themes: (40YD)
- Tool 1 - Understanding Science and Innovation (40XX)
- Tool 2 - Investing In Science And Innovation (40VH) (40Y0)
- Tool 3 - Using the Science of Science Policy to Address National Priorities (40Y5)
- Tool 4 - Identifying Data Needs for Implementing SoSP (40Y6)
We will pick theme-4: "Identifying Data Needs for Implementing SoSP" (40YL)
Discussion (40YN)
from the earlier chat conversation: (40YW)
- Joi Grieg: yes - I have a question: what is being done to understand the optimal ecosystems, so there is faster diffusion of knowledge, the accompanying impact of speeding practical usage, and a virtuous circle among government, industry, academia, associations, and other players? (40YO)
- Jeff Alexander: There are a few NSF grants awarded to researchers studying that topic (40YP)
- Jeff Alexander: you can search recent awards at http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=501084&org=SES&from=home (40YQ)
- Joi Grieg: thanks (40YR)
- Jeff Alexander: np. I think there's an emerging consensus that an optimal ecosystem is tough to define, because it will differ across scientific disciplines and technology sectors (40YS)
- Joi Grieg: I find the use of community maps to understand the roles of folks within a specific context useful. I haven't ever lifted that to the macro level of this conversation. (40YT)
- Jeff Alexander: I'm actually developing something like that for a client, around acceleration of vaccine development (40YU)
- Joi Grieg: Great - I'm doing a version of it for the IAC Emerging Technology SIG and my company on post-Research but not 'business as usual' IT technologies. Not exhaustive but strong on federal gov't, funded groups like the National Labs and FFRDCs, and others. (40YV)
This group's discussion: (410Q)
- Joi: how would participants (rather than policy makers) benefit from the SoSP data-needs solution, and use it to improve the management of their processes, innovation, and scientific knowledge? (410R)
- Peter: I would strongly recommend the use of ontologies and semantic technologies so that we can best leverage the data and information collected (410S)
- Peter: involve as broad a community as possible and use an "open collective intelligence" approach (supported by web 2.0 tools) (410T)
- Joi: we may also need to find out how fast we are moving from research to practical or commercial usage (410U)
- Joi: what other criteria? Maybe we could use a "balanced scorecard" to provide us with a dashboard view of the SoSP measurements and level of "success" (see the sketch after this list) (410V)
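As a rough illustration of the "balanced scorecard" idea above, the following Python sketch arranges a few hypothetical SoSP measurements under scorecard perspectives and reports a median value per metric as a simple "dashboard" view. This is a minimal sketch only; the perspective names, metric names, and numbers are assumed for illustration and were not proposed in the session.

    from dataclasses import dataclass, field
    from statistics import median

    @dataclass
    class Metric:
        name: str
        unit: str
        values: list[float] = field(default_factory=list)

        def summary(self) -> float:
            # Median keeps a few outlier projects from dominating the view.
            return median(self.values) if self.values else float("nan")

    # Hypothetical scorecard perspectives and metrics (illustrative only).
    scorecard = {
        "Knowledge diffusion": [Metric("Years from award to first publication", "years")],
        "Practical usage": [Metric("Years from publication to first licensed use", "years")],
    }

    scorecard["Knowledge diffusion"][0].values += [1.5, 2.0, 3.5]
    scorecard["Practical usage"][0].values += [4.0, 6.5]

    for perspective, metrics in scorecard.items():
        for m in metrics:
            print(f"{perspective}: {m.name} = {m.summary():.1f} {m.unit}")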
PeterYim: Other comments (ref. the Tool-4 survey) (415B)
Workshop Decision Tool: Theme 4: Identifying Data Needs for Implementing SoSP (415C)
Question-1: Supporting Research (415D)
- Option 3: Establish a shared research environment with award data for the research community to develop appropriate ontologies to track research (415E)
- this is good ... but should be extended to ALL datasets, rather than just “research data.” ... Also, the support of a global “Open Ontology Repository” will be crucial to the success of this approach. (415F)
- Option 11: Encourage agencies to develop a common taxonomy for R&D (415G)
- yes, to the extent possible ... but then, there are severe limitations as to how much one can “standardize” across disparate domains. The use of ontologies and better semantic technology would be the real answer (rather than standardizing on a taxonomy or a controlled vocabulary); see the sketch after this list. (415H)
- Option 12: Encourage agencies to collect information on subawards and subprojects (415I)
- yes ... in a sense ... we should encourage awardees to “expose” the information, rather than have the agencies “collect” that information. (415J)
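To make the comment on Option 11 more concrete (ontologies and semantic technology rather than a single standardized taxonomy), here is a minimal sketch using Python's rdflib: two hypothetical agency vocabularies keep their own award terms, a SKOS mapping ties each term to a shared concept, and one query then spans both sets of award records without either agency changing its own vocabulary. The namespaces, class names, and award records are invented for illustration only.

    from rdflib import Graph, Literal, Namespace, RDF
    from rdflib.namespace import SKOS

    # Hypothetical agency namespaces and terms -- not real agency vocabularies.
    NSF = Namespace("http://example.org/nsf/")
    NIH = Namespace("http://example.org/nih/")
    SOSP = Namespace("http://example.org/sosp/")

    g = Graph()

    # Each agency keeps its own term for roughly the same concept ...
    g.add((NSF.StandardGrant, SKOS.exactMatch, SOSP.ResearchAward))
    g.add((NIH.R01, SKOS.exactMatch, SOSP.ResearchAward))

    # ... and publishes award records in its own vocabulary.
    g.add((NSF.award_0001, RDF.type, NSF.StandardGrant))
    g.add((NSF.award_0001, SOSP.title, Literal("Modeling knowledge diffusion")))
    g.add((NIH.award_9999, RDF.type, NIH.R01))
    g.add((NIH.award_9999, SOSP.title, Literal("Vaccine development pipelines")))

    # One query follows the mappings across both agencies.
    q = """
    SELECT ?award ?title WHERE {
      ?localType skos:exactMatch sosp:ResearchAward .
      ?award a ?localType ;
             sosp:title ?title .
    }
    """
    for row in g.query(q, initNs={"skos": SKOS, "sosp": SOSP}):
        print(row.award, "-", row.title)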
Question-2: Measuring and tracking the scientific workforce (415K)
- Option 2: Invest in the development of a portal for all datasets (federal and non federal) that capture information about the STEM workforce (415L)
- that, as well as an open ontology repository that researchers, analysts and other system developers may make use of. (415M)
- Option 3: Use wage record data or IRS data to track the earnings and employment path of post doctoral researchers by matching with funding agency administrative records (415N)
- I don't think “wage” or “income” data make a good measure ... especially when dealing with strategic-competitiveness-type assessments (which is what we are trying to do.) (415O)
- Option 4: Use wage record data or IRS data to track the earnings and employment path of graduate students by matching with funding agency administrative records (415P)
- same comment as option-3 (415Q)
- Option 5: Use cybertools to scrape the web and track the earnings and employment path of post doctoral researchers by matching with funding agency administrative records (415R)
- as ad hoc research, yes ... but as a systematic means of data-collection and analysis, this approach would probably be skewed (as some domains show up on the web much more than others.) (415S)
- Option 6: Use cybertools to scrape the web and track the earnings and employment path of graduate students by matching with funding agency administrative records. (415T)
- same comment as option-5 (415U)
- Option 9: Encourage agencies to use unique PI identifiers so that federal funding can be tracked across agencies (415V)
- this would be redundant if we already have unique identifiers for a “person” ... we should, rather, see how to leverage already-existing ID scheme(s); see the sketch after this list. (415W)
- Option 10: Add questions to current federal surveys of organizations to capture information about the STEM workforce (415X)
- possibly, but this should be done with great care so as not to cause a new increase in workload and bureaucracy. (415Y)
- Option 11: Add questions to current federal surveys of individuals to capture information about their STEM training (415Z)
- same comment as option-10 (4160)
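As one concrete reading of the comment on Option 9 (leverage already-existing ID scheme(s) rather than minting a new federal PI identifier), this sketch aggregates hypothetical award records across agencies by an ORCID-style identifier already carried in each record. The agencies, award numbers, and amounts are made up for illustration.

    from collections import defaultdict

    # Hypothetical award records from three agencies; "pi_orcid" stands in
    # for whatever existing person-identifier scheme the PI already has.
    awards = [
        {"agency": "NSF", "award_id": "SES-0001", "pi_orcid": "0000-0002-1825-0097", "amount": 350_000},
        {"agency": "NIH", "award_id": "R01-9999", "pi_orcid": "0000-0002-1825-0097", "amount": 1_200_000},
        {"agency": "DOE", "award_id": "DE-0042", "pi_orcid": "0000-0001-5109-3700", "amount": 800_000},
    ]

    # Federal funding per person, tracked across agencies by reusing the
    # existing identifier instead of introducing a new PI number.
    by_person = defaultdict(list)
    for a in awards:
        by_person[a["pi_orcid"]].append(a)

    for orcid, items in by_person.items():
        agencies = sorted({a["agency"] for a in items})
        total = sum(a["amount"] for a in items)
        print(f"{orcid}: ${total:,} across {', '.join(agencies)}")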
Question-3: Measuring and tracking scientific outcomes (patent data, publication data, citations, etc.) (4161)
- Option 3: Standardize PI annual reports across funding agencies to capture journal articles in standardized form (4162)
- again, only to the extent possible ... using semantic technology to help integrate the data, without having to “standardize,” would probably be a better approach. (4163)
- Option 9: Require that PI’s cite funding source in standardized form in publications (4164)
- this should be basic ... as recognizing your funding source should, indeed, be part of the attribution requirements anyway (see the sketch after this list). (4165)
- Option 10: Require that PI’s cite funding source in standardized form in working papers (4166)
- same comment as option-9 (4167)
- Option 11: Require that PI’s cite funding source in standardized form in patents (4168)
- same comment as option-9 (4169)
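To illustrate what citing the funding source “in standardized form” could enable downstream (for publications, working papers, or patents alike), here is a small sketch that extracts agency and award identifiers from an acknowledgment sentence written in one assumed pattern. The pattern itself is a hypothetical example, not a format specified by any agency or by this workshop.

    import re

    # Hypothetical standardized acknowledgment of the form
    # "funded by <AGENCY> award <NUMBER>"; the real required format
    # would be set by the funding agencies, not by this sketch.
    PATTERN = re.compile(r"funded by (?P<agency>[A-Z]{2,5}) award (?P<award>[A-Z0-9]{2,4}-\d{4,7})")

    acknowledgment = (
        "This work was funded by NSF award SES-0123456 and "
        "funded by NIH award R01-1234567."
    )

    # Because the wording is predictable, the funding linkage can be
    # harvested automatically rather than re-keyed by each agency.
    for m in PATTERN.finditer(acknowledgment):
        print({"agency": m.group("agency"), "award": m.group("award")})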
Question-4: Measuring and Tracking Competitiveness (416A)
Question-5: Analytical Access by researchers and federal government agencies (416C)