Our mission at the Active Inference Journal is to increase the accessibility and quality of livestream transcripts.
Here are some updates from 2022 that reflect our progress toward this mission.
Livestream transcripts are now published and citable with a Digital Object Identifier (DOI). For example, here is the transcript of "The free energy principle made simpler but not too simple", Livestream #045, comprising 2 hours of background and 4 hours of conversation with Karl Friston and Thomas Parr. Another example is GuestStream #016 with Mark Solms, whose 4-hour transcript was more extensively hand-edited and enriched with images.
All transcripts and associated files are versioned in a central open-source repository. Contributions and additions (e.g., fixing typos, adding code simulations, developing translations) will continue in a distributed fashion.
In the interactive table of livestreams, each row links to the video, the GitHub folder, the published prose transcript, and the DOI. Where possible, there are also links to the slides used and the papers discussed.
We’ve demonstrated a reliable capacity to publish transcripts (captions, and prose with a DOI) within hours of a livestream. This opens exciting frontiers in digital scholarship, at the Institute and beyond.
We maintain all of our open-source tools in the Journal Utilities repository, enabling distributed development and application of our pipeline.
We continue to apply Systems Engineering and Active Inference to develop accessible and productive ways of working.
In 2023, we will continue active development of this project, and increase the quantity and quality of Active Inference Journal outputs.
We hope you will get involved in this effort.
Some directions for 2023 include:
We will continue to increase the accessibility and indexability of the Active Inference Journal.
We will continue to rapidly deploy transcripts within hours of each livestream, and iterate on them in our GitHub repository.
We will leverage advances in speech and text processing.
We will use the existing language translations of the Active Inference Ontology (https://coda.io/@active-inference-institute/active-inference-ontology-website/actinf-ontology-translations-4) to scaffold transcript translations across languages, and to synthesize perspectives across different discourse communities.
To learn more about contributing to the Active Inference Journal project, please see the Volunteer and Internship opportunities, or join our Discord.