In A Nutshell
You will play a critical role in supporting the CRAF’d mission to finance and scale data-driven solutions for crisis action.
Responsibilities
Fund Impact & Ecosystem Analytics
- Data Ecosystem Mapping: Expand our existing mapping of the crisis data ecosystem and financial‑flow analysis.
- Partner Database: Continuously update and analyze our partner‑network database, ensuring it remains current and actionable.
- Project Reporting: Consolidate and analyze historical and ongoing project data.
- Cost‑Effectiveness Analysis: Develop models to estimate the cost-effectiveness of projects and analyze budget allocations across the portfolio.
- Impact Estimation: Analyze how CRAF’d investments shape decision‑making and resource allocations; produce impact metrics for internal and external reporting.
Project Prioritization & Feasibility Assessment
- Application Process Design: Streamline and enhance the project prioritization process.
- Scoring Methodology Development: Build a data‑driven, AI‑assisted framework to score prospective projects on impact potential, feasibility, and strategic fit.
Digital Tools & Reporting
- Dashboard & Framework Design: Create interactive dashboards (e.g., Power BI, Plotly) for real‑time tracking of portfolio metrics and risk indicators.
- Analytical Briefings: Prepare concise, data‑backed reports and slide decks to guide Fund Manager and Steering Committee decisions.
- Fund‑Management Tools: Co‑design and iterate on internal tools that optimize resource allocation, automate routine analyses, and ensure data integrity.
Skillset
- Be currently enrolled in a Master’s degree (or higher), or in the final year of a Bachelor’s degree, in one of the following fields: Computer Science, AI / ML, Data Science, Statistics, Economics, Public Policy, Political Science, International Affairs, Development Studies, Business, or a related field. Alternatively, have already graduated with a university degree; if selected, you must start the internship within one year of graduation.
- Programming & Scripting: Proficiency in Python and R for data cleaning, processing, statistical analysis, and modeling. Experience with version control using Git and collaboration via GitHub.
- Analytics & Visualization: Skilled in exploring, visualizing, and interpreting complex datasets using tools like ggplot2, matplotlib, Datawrapper, and Plotly; proficient in building interactive dashboards with Power BI or similar software.
- APIs & Integration: Familiarity with RESTful APIs for data ingestion and integration across systems.
- Data Management: Experience working with big data and relational (SQL) or graph databases (e.g., Neo4j). Familiarity with common data standards and best practices for reproducible, transparent, and ethical data handling.
- Machine Learning & AI: Understanding of core machine-learning techniques, experience training and evaluating models, and familiarity with prompt engineering.
- Proficient in written and spoken English.