UNICEF Venture Fund: AI and Blockchain for Data and Trust (up to US$100K in equity-free funding)

Deadline: 23rd December 2024

The UNICEF Venture Fund is looking to invest in Open Source frontier technology solutions that have the potential to create radical change for children. The Fund is offering up to US$100K in equity-free funding for early-stage, for-profit technology start-ups that can improve the lives of children.

If your company is leveraging cutting-edge technologies such as artificial intelligence (AI), machine learning (ML), or blockchain, the Fund wants to hear from you!

They are specifically seeking companies registered in one of UNICEF’s programme countries that have impressive working prototypes and a commitment to Open Source licensing and practices.

Benefits

Up to US$100K in equity-free funding for early-stage, for-profit technology start-ups that can improve the lives of children.

Area 1

Misinformation and Disinformation

Are you creating tools, platforms or games leveraging new technologies to verify information and combat misinformation and/or disinformation? Or are you delivering behavioral interventions to consistently inform young people about misinformation and/or disinformation? The Fund is particularly interested in approaches that address mis/disinformation in multiple languages and formats (e.g. audio, video, image), and it encourages platforms that are accessible to persons with disabilities. Potential applications could explore solutions such as:

  • Game-based social and behavioral change interventions or platforms to help identify mis/disinformation
  • Mechanisms to review information, identify mis/disinformation, and/or provide legitimacy to true information shared online, for example software that can detect deepfakes in videos and images
  • Platform-agnostic tools using data science and AI to identify and analyze false or inaccurate content, and track the source and/or spread of this content
  • Interactive tools or games for children to engage with and learn about fact-checking
  • Mechanisms to tag data or watermark content at the source or while it is being circulated to increase trust in it (a minimal source-tagging sketch follows this list)
  • Tools which leverage the power of crowds to collectively monitor or identify data inaccuracy and to build trust through the power of social networks
  • Tools which can eliminate the need for third party auditing through innovative use of blockchains in a data collection or recording use case
  • Tools to audit social media platforms which are non-transparent (such as platforms which do not disclose their content recommendation algorithms or peer-to-peer messaging platforms), to determine how effective they are at removing or labelling mis/disinformation
  • AI-enabled systems to manage mis/disinformation during crises and ensure the dissemination of accurate information.
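As a rough sketch of the "tag or watermark content at the source" idea above, the snippet below attaches an HMAC-based provenance tag to a piece of content when it is published and re-checks the tag later. The key handling, tag format, and function names are illustrative assumptions rather than a prescribed design; a real system would need proper key management and more robust watermarking.

    import hmac
    import hashlib
    import json

    # Hypothetical publisher-side secret for this sketch only; a real deployment
    # would manage per-publisher keys properly (e.g. in an HSM or via a PKI).
    PUBLISHER_KEY = b"demo-secret-key"

    def tag_content(content: bytes, publisher_id: str) -> dict:
        """Attach a provenance tag (publisher id + HMAC) to a piece of content."""
        digest = hmac.new(PUBLISHER_KEY, content, hashlib.sha256).hexdigest()
        return {"publisher": publisher_id, "sha256_hmac": digest}

    def verify_tag(content: bytes, tag: dict) -> bool:
        """Recompute the HMAC and compare it with the tag that travelled with the content."""
        expected = hmac.new(PUBLISHER_KEY, content, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag["sha256_hmac"])

    if __name__ == "__main__":
        original = b"Vaccines are safe and effective."
        tag = tag_content(original, publisher_id="example-publisher")
        print(json.dumps(tag, indent=2))

        print(verify_tag(original, tag))                    # True: untouched copy
        print(verify_tag(b"Vaccines cause illness.", tag))  # False: altered copy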

Area 2

Data generation, collection and analysis

Are you using novel approaches to compile and validate large amounts of training data? Or creating new data through field data collection, crowdsourcing, or social network platforms? This could include use cases such as:

  • Building safe and secure data collection and management systems following Open standards (for transparency and accountability) while anonymizing sensitive data or leveraging privacy-enhancing technologies (see the anonymization sketch after this list)
  • Developing models to analyse large amounts of data and generate insights for decision-making and resource allocation
  • Identifying methods to manage emotional or cognitive bias in data collection
  • Generating new data through field data collection, crowdsourcing or social network platforms for understanding trends and conducting situational analysis
  • Providing transparency and accountability in how data is collected, managed, analysed, benchmarked, and generated
  • Developing systems for navigating existing resources and available information
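As a loose illustration of the "anonymizing sensitive data" point above, the sketch below pseudonymizes direct identifiers with a salted hash before records are stored. The field names, salt handling, and helper names are assumptions made for the example; a real deployment would combine this with broader privacy-enhancing technologies and a data-protection review.

    import hashlib
    import os

    # Hypothetical per-dataset salt; in practice it would be stored separately from
    # the anonymized records and rotated according to the data-protection policy.
    SALT = os.urandom(16)

    def pseudonymize(value: str) -> str:
        """Replace a direct identifier with a salted SHA-256 digest."""
        return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()

    def anonymize_record(record: dict, sensitive_fields: set) -> dict:
        """Return a copy of a collected record with sensitive fields pseudonymized."""
        return {
            key: pseudonymize(str(value)) if key in sensitive_fields else value
            for key, value in record.items()
        }

    if __name__ == "__main__":
        record = {"respondent_name": "A. Example", "district": "District 7", "age_group": "10-14"}
        print(anonymize_record(record, sensitive_fields={"respondent_name"}))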

Area 3

Digital Trust 

Are you leveraging existing and new technologies to build digital trust? Or are you generating insights to assess and mitigate the threats and harms to children in digital environments? The Fund is seeking start-ups that are building new tools, for instance:

  • Decentralized protocols for content ownership, attribution, and licensing using blockchain technology
  • ML/AI applications to monitor and model potential online risks to children, including those generated by AI systems
  • Blockchain or AI tools to ensure credible proof of humanity and secure “KYC” (know-your-customer) processes
  • Tools that use digital footprints from sources like social media or mobility patterns to generate insights, such as risk analyses or forecasts to trigger interventions before a crisis occurs
  • Tools that leverage blockchain to verify online content, for example by creating trusted collections of information voted on by verified sources against transparent criteria
  • Systems which improve data provenance and auditability (a toy hash-chain example follows this list)
  • Game-based educational tools and guidance for children to learn about the concepts of privacy, respect and sharing of content online
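To make the data-provenance and blockchain-style verification ideas above concrete, here is a toy, in-memory hash chain: each entry commits to the hash of the previous one, so any later edit to an earlier record becomes detectable. The structure, field names, and verification logic are assumptions for a teaching sketch and are far simpler than a production blockchain or audit log.

    import hashlib
    import json
    import time

    def _hash_entry(entry: dict) -> str:
        """Deterministically hash an entry (its data, predecessor hash, and timestamp)."""
        payload = json.dumps(entry, sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()

    def append_entry(chain: list, data: dict) -> None:
        """Append a record whose hash covers the previous entry, extending the tamper-evident chain."""
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        entry = {"data": data, "prev_hash": prev_hash, "timestamp": time.time()}
        entry["hash"] = _hash_entry({k: v for k, v in entry.items() if k != "hash"})
        chain.append(entry)

    def verify_chain(chain: list) -> bool:
        """Recompute every hash and link; any edit to an earlier record breaks verification."""
        prev_hash = "0" * 64
        for entry in chain:
            expected = _hash_entry({k: v for k, v in entry.items() if k != "hash"})
            if entry["hash"] != expected or entry["prev_hash"] != prev_hash:
                return False
            prev_hash = entry["hash"]
        return True

    if __name__ == "__main__":
        chain = []
        append_entry(chain, {"source": "field_office_12", "claim": "school reopened"})
        append_entry(chain, {"source": "verifier_3", "vote": "confirmed"})
        print(verify_chain(chain))            # True: chain is intact
        chain[0]["data"]["claim"] = "school closed"
        print(verify_chain(chain))            # False: tampering is detectable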
