Chris started by pointing out what is needed to fuel AI tools and make them function properly, and the often-overlooked physical costs this imposes on society:
- AI requires huge amounts of structured data, but with 50% of humanity not digitally connected, how is it possible to ensure representation?
- Physical facilities (data centres) occupy a large amount of land, preferably close to large populations. They also consume significant quantities of water and energy. Their location poses serious ethical and societal questions.
Up to now, the use of AI tools in the philanthropy field has generally been limited to speeding up processes and serving as a 'brainstorming' partner — for example, through data processing, translation, and producing a first draft to work on.
He shared some of the things that community foundations could do:
- Aggregation of data at local, national or regional levels, and making sense of that data by exploring patterns and intersections;
- Advocating for more open data, especially with governments, to have broader access and understand what is available and missing.
Chris highlighted the fact that the most disadvantaged and marginalised communities are not online generating data for the data sets and analyses that may be shaping the delivery of public services. Finally, he touched upon the political implications: in the most significant election year of all time, AI-generated misinformation has already affected several countries, such as India and Bangladesh, and deepfakes have repeatedly been registered in the US. All this drives further polarisation and pulls apart the social fabric that civil society organisations then have to stitch back together.
Alex Tveit posed the question “Why should community foundations care?” He brought system thinking to frame the conversation, quoting Donella Meadows: “A system is an interconnected set of elements that is coherently organised in a way that achieves something. A system must consist of three kinds of things: elements, interconnections, and a function or purpose.”
He suggested that technologies like AI exist in interconnected systems and are underpinned by a foundation of digital equity, whose pillars are connectivity, affordability, digital literacy, and hardware availability. As community foundations explore how to move from being merely takers of technology to engaging with and understanding AI actively, grasping this interconnectedness is crucial. They need to embrace a mindset of entanglement: the interconnected nature of wealth, risk, and the broader socio-economic and environmental systems. This approach will help foundations shape AI in a way that aligns with their values and ensures that AI's benefits are distributed equitably and contribute to an inclusive and sustainable future.
He strongly underlined that inclusion is key in the development of AI and that incorporating a full spectrum of perspectives is essential — not only people but also, for instance, the environment. Without them, we risk AI being shaped by a limited world view, potentially overlooking the unique needs and values of diverse communities.
Moving to the potential, Alex reiterated that AI is cross-sectoral, offering game-changing opportunities: its predictive ability could bring advancements in a range of fields, from healthcare to climate science. However, he suggested this only scratches the surface, especially for social impact purposes. Without inclusion, these tools will be developed through the eyes and perspectives of the very few.
He shared four inclusive examples:
- First Languages AI Reality (FLAIR) Initiative, whose goal is to develop a method for the rapid creation of custom ASR models for Indigenous languages;
- Indigenous AI's Abundance Intelligences research programme, which imagines anew how to conceptualize and design Artificial Intelligence (AI) based on Indigenous Knowledge (IK) systems;
- PolArctic, an oceanographic and data science company focused on creating tailored products for the Arctic;
- DrumBeat.AI, which uses artificial intelligence to beat ear disease in Aboriginal and Torres Strait Islander kids.
Furthermore, drawing from the Mars user-prompted format example, he raised an open question: 'What could a similar structure mean for the collection of community foundation data and wider non-profit data — sharing knowledge, models, and case studies — and how can we use them to collaborate around a data commons and surround people with things that help them do their work?'
Focusing on concerns, Alex mentioned:
- How AI could be used to make decisions that shape the trajectory of our lives, from the profoundly impactful, like what kind of jobs we get and how much we are paid, to the cost of groceries;
- A significant shift is coming in the job market, with nearly half of current jobs set to be automated over time, and the retraining needs that follow;
- Existing bias and discrimination: without proper representation of marginalised populations, AI will perpetuate social prejudices and inequities;
- Societal issues, notably in relation to politics and elections.
He ended with a further remark: 'Everyone must be included in shaping AI – working towards AI systems that are human-centred, inclusive, ethical, and sustainable, and which uphold human rights and the rule of law.'
The conversation with participants highlighted the issues below:
- Community-generated data remains an untackled opportunity and requires a vast collaborative effort. The data sets used in LLMs are giant but can also be confined. The main question is how, together as community foundations, we can build a collaborative tool and process to create a collective knowledge base.
- How can AI-driven models help us gather further individual and collective insights into deep relational bonds, such as those with donors in the community and the wider world?
- Up to now, community foundations have experimented with AI primarily in a limited range of operational areas, such as communications, back office, checking policies, and grant agreements. But it was noted that automation tools to speed up these workloads are already available (Chris mentioned UiPath, for instance, or Power Automate in the Microsoft suite).
- There is a lack of confidence among communities and grantees in using AI, and a need to develop policies around it.
- ChatGPT-written applications: how do we respond? With ChatGPT available, many organisations are seeing an increase in AI-generated applications. It was questioned whether these should be screened out, or whether this should be recognised as a valid way of achieving inclusion. Greater attention should therefore be paid to the information required and the quality of the questions being asked.
- Can AI free up community foundation staff to do more of the connecting, human-driven work? Also, how can we leverage tools such as mapping tools, making collective information available visually, to find new ways to drive resources to smaller, marginalised civil society organisations and movements?