PARSEC
Building New Tools for Data Sharing and Re-use through a Transnational Investigation of the Socioeconomic Impacts of Protected Areas
PARSEC is a project funded by the Belmont Forum, selected under the call for proposals on “Science-driven e-infrastructure Innovation (SEI)”. The project is led by Nicolas Mouquet, also CESAB's Scientific Director, and brings together several partners from France (FRB, University of Toulouse), Brazil (University of São Paulo), the United States (American Geophysical Union – AGU) and Japan (National Institute of Information and Communications Technology – NICT).
Scientific advances depend on the availability, accessibility and reusability of data, software, samples, and data products. Yet large amounts of data on the Earth are poorly preserved, or not preserved at all. PARSEC addresses these challenges by supporting the collaboration of a synthesis-science team and a data-science team.
- The Synthesis Science team is employing artificial intelligence techniques to analyse satellite images and socio-economic information to better predict and mitigate the effects of actions that potentially threaten the livelihoods and health of local (indigenous) communities. Like most researchers who investigate complex environmental problems, the team depends significantly on the availability of good, spatially dispersed, multidisciplinary, time-series data. The synthesis-science team is hosted at CESAB for the duration of the project (4 years).
- The Data Science team, composed of leading environmental data management professionals, data communities (RDA, ESIP), society journals (AGU), and representatives of e-infrastructures for data attribution (e.g., DataCite and ORCID), will develop leading practices on data citation, attribution, credit, and reuse. As part of the integrated work with the synthesis-science team, the data-science team will review best practices for data management and stewardship, using this effort as a case study for the wider scientific community to optimise data access and reuse. The team will also develop and implement a new tool to better track data usage and reuse for researchers.
PARSEC is a trans-disciplinary and trans-national project working on the use and re-use of environmental and socio-economic data to assess practices for data management and conservation. The project provides a unique opportunity for data scientists and synthesis scientists to collaborate in real time toward the goal of improving research outcomes and data sharing. The resulting tools and metrics will enable better prediction and mitigation of the effects of actions that disrupt historical land-use practices and threaten local communities.
More information is available on the PARSEC website.
Country leads:
Brazil, France, Japan, United States.
Co-operating Partners:
Australia – Lesley WYBORN; United Kingdom – Helen GLAVES.
Partners:
DataCite; Earth Science Information Partners (ESIP); Environmental Data Initiative (EDI); ORCID; Research Data Alliance (RDA); Sharing Rewards and Credit (SHARC) Interest Group (RDA); World Data System (WDS); Academia Sinica (Taipei); The John Wesley Powell Center (JWP); The Nature Conservancy (TNC).
PI:
Nicolas MOUQUET – MARBEC, CNRS Montpellier, FRB (France)
Co-PIs:
David MOUILLOT – MARBEC, University of Montpellier (France); Alison SPECHT – University of Queensland (Australia); Shelley STALL – American Geophysical Union (USA).
Postdocs:
Ali BEN ABBES – FRB-Cesab (France); Jeaneth MACHICAO – University of São Paulo (Brazil).
The PARSEC synthesis-science team brings together experts in Data Science, Conservation Biology, Remote Sensing, Data Mining, Artificial Intelligence, Protected Areas, and Social Science.
PARSEC was selected from the 2018 Belmont Forum call for proposals.
[08] Ben Abbes A, Machicao J, Corrêa PL, Specht A, Devillers R, Ometto JP, Kondo Y & Mouillot D (2024) DeepWealth: A generalizable open-source deep learning framework using satellite images for well-being estimation. SoftwareX, 27, 101785. DOI: 10.1016/j.softx.2024.101785.
[07] Mouillot D, Velez L, Albouy C, Casajus N, Claudet J, Delbar V, Devillers R, Letessier TB, Loiseau N, Manel S, Mannocci L, Meeuwig J, Mouquet N, Nuno A, O’Connor L, Parravicini V, Renaud J, Seguin R, Troussellier M & Thuiller W (2024) The socioeconomic and environmental niche of protected areas reveals global conservation gaps and opportunities. Nature Communications, 15, 9007. DOI: 10.1038/s41467-024-53241-1.
[06] Specht A, O’Brien M, Edmunds R, Corrêa P, David R, Mabile L, Machicao J, Murayama Y & Stall S (2023) The Value of a Data and Digital Object Management Plan (D(DO)MP) in Fostering Sharing Practices in a Multidisciplinary Multinational Project. Data Science Journal, 22, 38. DOI: 10.5334/dsj-2023-038.
[05] Machicao J, Ben Abbes A, Meneguzzi L, Corrêa PLP, Specht A, David R, Subsol G, Vellenich D, Devillers R, Stall S, Mouquet N, Chaumont M, Berti-Equille L & Mouillot D (2022) Mitigation strategies to improve reproducibility of poverty estimations from remote sensing images using deep learning. Earth and Space Science, 9, e2022EA002379. DOI: 10.1029/2022EA002379.
[04] Machicao J, Specht A, Vellenich D, Meneguzzi L, David R, Stall S, Ferraz K, Mabile L, O'Brien M & Corrêa P (2022) A deep-learning method for the prediction of socio-economic indicators from street-view imagery using a case study from Brazil. Data Science Journal, 21, 1–15. DOI: 10.5334/dsj-2022-006.
[03] Specht A & Crowston K (2022) Interdisciplinary collaboration from diverse science teams can produce significant outcomes. PLoS ONE, 17, e0278043. DOI: 10.1371/journal.pone.0278043.
[02] David R, Mabile L, Specht A, Stryeck S, Thomsen M, Yahia M, Jonquet C, Dollé L, Jacob D, Bailo D, Bravo E, Gachet S, Gunderman H, Hollebecq J-E, Ioannidis V, Le Bras Y, Lerigoleur E & Cambon-Thomsen A (2020) FAIRness Literacy: The Achilles' Heel of Applying FAIR Principles. Data Science Journal, 19, 32. DOI: 10.5334/dsj-2020-032.
[01] Specht A, Corrêa P, Belbin L & Loescher HW (2020) Critical research infrastructure: The importance of synthesis centers. Elephant in the Lab. DOI: 10.5281/zenodo.3660920.