Insights from the Microsoft AI Tour 2026 in Munich
Between Vision, Sovereignty and Real‑World Applications
Martin Danner, 13 March 2026
In February we had the opportunity to attend the Microsoft AI Tour in Munich by invitation. The day was filled with insightful talks, technical deep dives, and strategic perspectives on how artificial intelligence is evolving and what role it will play for companies, research institutions, and society in the coming years.
AI as the Bridge Between Domain Models and LLMs
One of the most memorable moments was the keynote by Satya Nadella. Instead of presenting a traditional product showcase, he chose to highlight the potential of modern AI through a research project in the healthcare domain.
At the center of the presentation was a model called GigaTIME. This multimodal AI framework generates virtual multiplex immunofluorescence (mIF) images from conventional H&E (hematoxylin and eosin) tissue slides, enabling deeper analysis of the interactions between tumor cells and the immune system. Trained on data from around 40 million cells, the model supports population‑scale analyses across thousands of patients and helps uncover relationships between proteins, biomarkers, tumor staging, and survival rates – analyses that were previously difficult because such data was scarce.
Even more interesting than the demo itself was Satya Nadella’s reflection afterward. Paraphrasing his statement:
Imagine the power of combining those highly innovative models in the future with the capabilities of LLMs.
This describes a development we are currently observing in many areas: the combination of domain‑specific AI models with large language models opens up entirely new possibilities to make complex relationships accessible and to connect knowledge across disciplines and data sources. At the same time, this combination creates entirely new interfaces for interacting with data, models, and applications. Natural language increasingly becomes the universal interface through which complex analyses, workflows, and systems can be controlled.
Our Perspective: Multimodal Models for the Language of Life
For us as scieneers, this topic is particularly exciting. As part of research at the RWTH Aachen University Hospital, in which we are fortunate to participate, we are also working on multimodal approaches.
We have developed a model called Genolator that jointly processes and combines information from DNA sequences, amino acid sequences, protein structures, and natural language. The goal is to answer questions in natural language about the molecular function or biological processes associated with given DNA sequences.
This creates a first foundation for interacting with our genome using natural language and potentially opens up new ways for future models and research to generate deeper insights into genetic mechanisms.
Digital Sovereignty as a Central Theme
Beyond visionary research topics, another key theme of the AI Tour was digital sovereignty.
In politically and economically turbulent times, many organizations are increasingly seeking greater control over their data, infrastructure, and regulatory compliance. Microsoft presented a multi‑layered concept for sovereign cloud architectures.
Sovereign Public Cloud
One layer is the Sovereign Public Cloud. In this model, organizations remain within the public cloud environment while ensuring that their data and workloads are hosted in regional data centers to meet regulatory and data residency requirements. Initiatives such as the EU Data Boundary ensure that sensitive European customer data is processed and stored within the EU.
In addition, technologies such as virtual private networking, Bring Your Own Key (BYOK) for customer‑controlled encryption keys, and confidential computing provide further security and control mechanisms to protect data – even during processing. Together with local operational and partner structures, this approach combines the scalability of the public cloud with stronger sovereignty and compliance guarantees.
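The core idea behind BYOK can be sketched in a few lines. The following is a conceptual illustration using Node's built-in crypto module, not Azure Key Vault's actual API: the customer generates and retains the encryption key, so the platform only ever handles ciphertext it cannot read on its own.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Hypothetical BYOK sketch: the key is generated and held by the customer.
// In a real deployment this key would live in an HSM or key vault under
// customer control, never alongside the stored data.
const customerKey = randomBytes(32); // 256-bit AES key

function encrypt(plaintext: string, key: Buffer) {
  const iv = randomBytes(12); // fresh nonce per message, as AES-GCM requires
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, data, tag: cipher.getAuthTag() };
}

function decrypt(box: { iv: Buffer; data: Buffer; tag: Buffer }, key: Buffer) {
  const decipher = createDecipheriv("aes-256-gcm", key, box.iv);
  decipher.setAuthTag(box.tag); // authenticity check: tampering raises an error
  return Buffer.concat([decipher.update(box.data), decipher.final()]).toString("utf8");
}
```

Without `customerKey`, the stored `{ iv, data, tag }` triple is opaque – which is exactly the control guarantee BYOK is meant to provide.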
If you would like to learn more about this approach, you can find additional information here.
Sovereign Private Cloud
For organizations with even stricter requirements, Microsoft introduced the Sovereign Private Cloud. This is a fully customer‑operated cloud platform that brings modern Azure Services directly into an organization’s own data centers. Technologies such as Azure Local and Microsoft 365 Local enable core cloud services such as compute, storage, Kubernetes clusters, and collaboration services to run fully on‑premises, in hybrid environments, or even in fully isolated infrastructures.
What is particularly interesting is that modern AI platforms such as Azure AI Foundry can also be operated in such environments. This makes it possible to deploy and run large language models and AI workloads directly on an organization’s own infrastructure – provided that sufficient compute resources are available.
For organizations with strict data governance requirements or highly sensitive data environments, this opens new opportunities to leverage modern AI technologies while maintaining full control over data and systems.
This approach is especially relevant for government institutions, critical infrastructure operators, and highly regulated industries.
You can learn more about this architecture here.
National Partner Clouds
This model is complemented by so‑called National Partner Clouds. One example is Delos, a subsidiary of SAP, which operates a cloud environment designed to meet the requirements of the German Federal Office for Information Security (BSI). Currently the focus is primarily on public sector organizations, but such models may also become relevant for companies operating in regulated industries.
If you are interested in learning more about transparency and governance in this context, Microsoft also publishes regular transparency reports. These reports provide insights into how the company handles government data requests and content removal requests.
If you are curious, you can take a look here: Transparency Reports
AI in Business Processes
Beyond infrastructure topics, there were also several examples of how AI is already being integrated into business processes today. Companies such as Lufthansa demonstrated how LLM‑based agents are already supporting and automating operational workflows.
The conversation is therefore shifting from what is technically possible to where AI already delivers measurable business value.
From Low‑Code to AI‑Generated Development
Another interesting development concerns the Power Platform. A clear transition is emerging: moving away from traditional low‑code approaches toward AI‑driven generative development.
Applications can increasingly be generated automatically as React and TypeScript code. Developers can further refine this code in their preferred development environments before deploying it back to the Power Platform, where it benefits from features such as Entra authentication and deep integration into the Microsoft ecosystem.
Especially exciting: this new “Vibe Coding” experience within the Power Platform can already be explored today in a beta version. Access is currently limited to a US environment, but this can be provisioned quickly and in isolation alongside your existing environment within just a few minutes. If you’re curious to try it out yourself, you can take a look here: Vibe Coding PowerApps.
Excel Is Experiencing a Revival
A small but notable detail also caught attention: Excel seems to be experiencing something of a revival. With the integration of Copilot and new agent capabilities, Excel is increasingly becoming an intelligent interface for working with data.
Particularly interesting is the ability for agents to aggregate data from multiple sources directly within Excel – for example from SharePoint, internal document repositories, or even SQL databases. This increasingly turns Excel into a central interface where data from different systems can be combined, analyzed, and explored using AI.
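The kind of cross-source aggregation described above can be illustrated with a small sketch. This is a hypothetical example, not Excel's or Copilot's actual agent API: it joins records from two imagined sources – a SharePoint document list and a SQL query result – on a shared key, the sort of merge an agent might perform behind the scenes before presenting a combined table.

```typescript
// Hypothetical record shapes for two data sources an agent might pull from.
type SharePointRow = { customerId: string; contractFile: string };
type SqlRow = { customerId: string; revenue: number };

// Merge both sources on customerId; customers without a SQL match get revenue 0.
function mergeByCustomer(sp: SharePointRow[], sql: SqlRow[]) {
  const revenueById = new Map(sql.map((r) => [r.customerId, r.revenue]));
  return sp.map((r) => ({ ...r, revenue: revenueById.get(r.customerId) ?? 0 }));
}
```

The interesting part in the Excel scenario is not the join itself but that the agent decides which sources to query and how to combine them, based on a natural-language request.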
Further Observations: AI, Accessibility, and a Look Behind the Scenes
Another interesting observation during the AI Tour was how AI was used to improve linguistic accessibility. All talks and panel discussions during the event were translated live and simultaneously, allowing attendees to follow the sessions in both German and English in real time.
A small fun fact from the first session in the morning: the model apparently still needed its first coffee. During the opening talk, the system briefly spoke in its own mysterious language before the translation kicked in properly. After this small hiccup, however, the simultaneous translation worked reliably in both directions – German ↔ English.
Even large tech companies are not immune to small growing pains when deploying new technologies. At the same time, the demonstration clearly showed how much potential AI already has to make events and content more accessible for an international audience.
Another observation I found particularly interesting was that Microsoft relied on Anthropic models for some of the coding‑agent product demonstrations rather than OpenAI models. This once again highlights how open and dynamic the current AI ecosystem is and that in many scenarios the combination of different models and providers creates the greatest value.
Conclusion
Our conclusion after this day in Munich: AI is rapidly evolving from a technology used for individual applications into a foundational infrastructure for business, research, and organizations.
The most exciting developments will likely come from the combination of domain‑specific models, large language models, and domain‑specific data.
The Microsoft AI Tour clearly demonstrated how broad the field has already become – from medical research and sovereign cloud architectures to concrete business applications.
And it once again made clear that many of the ideas currently being explored in research and industry are already becoming reality, and that the current wave of developments around large language models is here to stay.
Author
Martin Danner, Data Scientist at scieneers GmbH
martin.danner@scieneers.de