
Q.ANT raises €62 million to transform the future of computing with photonic processing

Q.ANT, a pioneer in photonic processing, today announced a €62 million Series A financing round to accelerate the commercialization of its energy-efficient photonic processors for artificial intelligence (AI) and high-performance computing (HPC).

Deep Tech Rising Star: EXF Alpha Fund Invests in Photonic Computing Leader Q.ANT’s €62M Series A

EXF Alpha, the syndication fund managed by Venionaire Ventures S.à r.l., a subsidiary of Venionaire Capital AG, has joined the €62 million Series A funding round of Q.ANT. This German deep tech company is transforming computing with photonic processing technology designed for AI and high-performance computing (HPC).

A Strategic Investment in Next-Generation Computing

Q.ANT develops cutting-edge photonic processors designed to outperform traditional CMOS-based chips. Their technology delivers better energy efficiency and scalable performance – two essential features for growing AI and HPC infrastructure. As global data demands rise, Q.ANT’s solutions help reduce energy consumption while increasing computing power.

This funding round was co-led by Cherry Ventures, UVC Partners, and imec.xpand. Additional investors include L-Bank, Verve Ventures, Grazia Equity, LEA Partners, Onsight Ventures and TRUMPF.

Q.ANT: The Photonic Computing Leader

Based in Stuttgart, Germany, Q.ANT GmbH is a spin-off of TRUMPF, a global leader in industrial manufacturing. The company designs photonic chips that drive faster, greener data processing. Its innovations power applications in AI, sensing, and medical diagnostics.

“This investment proves that Europe has both the ambition and the capital to lead,” said Dr. Michael Förtsch, founder and CEO of Q.ANT. “It also connects us with strong partners who share our mission to shape the future of computing.”

By replacing electrons with light, Q.ANT builds processors that are not only faster, but also more sustainable.

Supporting Europe’s Deep Tech Ecosystem

At Venionaire Capital, we back European technologies that solve global challenges. Q.ANT’s innovation is a great example. Their processors offer data centers a way to boost performance while cutting energy use.

“We focus on deep tech from Europe with the potential for global impact,” said Berthold Baurek-Karlic, CEO of Venionaire Capital and Managing Director of Venionaire Ventures S.à r.l. “Q.ANT’s photonic architecture offers exactly that: performance, efficiency, and scalability.”

This investment fits our strategy to support breakthrough technologies that promise long-term value.

Aligning With Our Mission

We believe Q.ANT is well-positioned to lead a new era in computing. Their work shows that innovation and sustainability can go hand in hand.

We’re proud to support Dr. Michael Förtsch and his team as they redefine how the world thinks about computing power.

The New Rules of AI: Execution, Speed and Specialization

After more than a decade in the AI space, it’s clear that we’ve entered a fundamentally new phase — one where foundational models, open-source acceleration, and application-layer innovation are reshaping the rules of competition. The pace of change over the past two years has been unprecedented, and the assumptions that defined the “first wave” of AI no longer hold. 

 

From Deep Tech to Agile Engineering 

In the first wave of modern AI (roughly 2012–2021), competitive advantage was rooted in deep technical expertise. Founding teams were often led by PhDs in computer science or applied mathematics, and startups differentiated themselves by building proprietary models. The default assumption — shared by founders and investors alike — was that access to talent, compute, and data could create defensible intellectual property. 

That assumption no longer holds. 

With the rise of foundation models like GPT-4, Claude, LLaMA, and Mistral, we now have general-purpose systems with strong performance across a wide range of tasks. These models function as powerful abstraction layers — analogous to what Amazon Web Services or React did for web development. You no longer need to build the engine; you need to understand how to drive it effectively. 
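To make the idea of “driving the engine” concrete, here is a minimal sketch of a product feature built on top of a hosted foundation model through its API rather than on a proprietary model. The client library is the OpenAI Python SDK; the model name, prompt, and helper function are illustrative assumptions, not a recommendation.

```python
# Minimal sketch: building on a hosted foundation model instead of training one.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set in the environment;
# the model name and prompts below are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

def summarize_support_ticket(ticket_text: str) -> str:
    """Turn a raw support ticket into a one-line summary for a downstream workflow."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any hosted chat model could be substituted
        messages=[
            {"role": "system", "content": "Summarize the customer's issue in one sentence."},
            {"role": "user", "content": ticket_text},
        ],
    )
    return response.choices[0].message.content
```

The point of the sketch is where the value sits: not in the model call itself, but in the workflow, data, and product built around it.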

 

The Open-Source Shift 

Open-source models have fundamentally altered the innovation landscape. Meta’s open-weight models, Mistral’s high-performance alternatives, and open image-segmentation frameworks are enabling companies of all sizes to build sophisticated AI applications without massive R&D investments. DeepSeek, which we covered earlier this year, is another case in point.
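As a hedged illustration of how low the entry barrier has become, the sketch below runs an open-weight model locally through the Hugging Face transformers library; the specific checkpoint name and prompt are assumptions for illustration, and any comparable open model could be substituted.

```python
# Minimal sketch: running an open-weight model locally via Hugging Face transformers.
# Assumes `pip install transformers torch` and enough memory for the chosen checkpoint;
# the model id below is an illustrative assumption.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.3",  # assumption: any open-weight checkpoint works
)

prompt = "Draft a two-sentence product update for our logistics customers."
result = generator(prompt, max_new_tokens=80)
print(result[0]["generated_text"])
```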

In the context of the current AI race, this shift has several implications: 

  • Investor focus is moving up the stack, from infrastructure to use-case execution. 
  • Startups no longer need deep ML research teams — they need engineers who can integrate, fine-tune, and build useful products. 
  • IP is now built on data and workflows, not on proprietary model code. 

These developments are democratizing access but also compressing the window for defensibility. In AI today, first-mover advantage is fleeting unless paired with deep market understanding and fast iteration cycles. 

 

The Bottom-Up Transformation of Enterprise AI 

In contrast to the previous top-down enterprise AI adoption — where executives pursued cost optimization or process automation — we’re now seeing a bottom-up wave of implementation. Employees are increasingly using LLM-powered tools independently, leading to the rise of so-called “shadow AI” within large organizations. 

This mirrors the early SaaS revolution, where departments deployed their own solutions long before IT officially approved them. For AI, this shift could redefine how large enterprises approach innovation — making it more agile, decentralized, and iterative. 

Value-Based Pricing: A New Commercial Paradigm 

The economics of AI do not align neatly with traditional SaaS models. High inference costs, energy consumption, and the need for constant retraining complicate standard subscription pricing. This is leading to a reevaluation of pricing strategies, with value-based pricing emerging as a viable alternative. 

This model ties cost to measurable outcomes — such as leads generated, time saved, or content produced — and is already being tested in domains like sales enablement and customer support. It aligns well with agent-based architectures and dynamic workload distribution, where usage and value vary significantly. 
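A simple way to see the contrast is to compare a flat subscription with an invoice tied to a measurable outcome. The sketch below is purely hypothetical; all rates and volumes are invented for illustration.

```python
# Hypothetical sketch: flat subscription vs. value-based (outcome-linked) pricing.
# All rates and volumes below are invented for illustration only.

FLAT_MONTHLY_FEE = 2_000.00        # EUR, classic SaaS subscription
PRICE_PER_QUALIFIED_LEAD = 15.00   # EUR, value-based rate tied to a measurable outcome

def monthly_invoice(qualified_leads: int) -> dict:
    """Compare what the customer pays under each model for a given outcome volume."""
    return {
        "flat_subscription": FLAT_MONTHLY_FEE,
        "value_based": round(qualified_leads * PRICE_PER_QUALIFIED_LEAD, 2),
    }

# A slow month and a strong month: the value-based bill tracks delivered outcomes,
# while the subscription stays constant regardless of value created.
print(monthly_invoice(qualified_leads=40))   # {'flat_subscription': 2000.0, 'value_based': 600.0}
print(monthly_invoice(qualified_leads=400))  # {'flat_subscription': 2000.0, 'value_based': 6000.0}
```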

 

The Hardware Bottleneck — and Photonic Computing 

While software has accelerated, hardware is now the limiting factor. GPUs dominate the current compute landscape, but their power consumption and supply constraints are unsustainable at scale. 

One of the most promising developments in this area is photonic computing. Unlike traditional chips, photonic processors use light to perform calculations, drastically reducing energy usage and heat generation. Several European companies — including Germany-based Q.ANT — are developing photonic AI hardware, including plug-and-play PCIe cards designed for local model inference. 

Photonic chips are particularly well-suited for matrix-heavy AI tasks and could be instrumental in the next phase of model deployment, especially at the edge or in energy-sensitive environments. 
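To see why “matrix-heavy” matters, the sketch below runs the dense matrix multiplication at the core of a neural network layer and counts its multiply-accumulate operations — the kind of workload photonic accelerators are designed to take over from GPUs. The layer sizes are arbitrary assumptions.

```python
# Illustrative sketch: the matrix multiplication at the heart of a dense neural layer.
# Photonic processors target exactly this kind of operation; layer sizes are arbitrary.
import numpy as np

batch, d_in, d_out = 32, 4096, 4096                    # assumed shapes for illustration
x = np.random.randn(batch, d_in).astype(np.float32)    # activations
W = np.random.randn(d_in, d_out).astype(np.float32)    # layer weights

y = x @ W                                              # the matrix multiply an optical MAC array would execute

macs = batch * d_in * d_out                            # multiply-accumulate count for one forward pass
print(f"output shape: {y.shape}, MACs per pass: {macs:,}")
```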

 

Toward Domain-Specific AI Applications 

Another key development is the narrowing of focus within AI startups. Early-stage ventures are moving away from vague platform ambitions (“LLMs for healthcare”) and instead focusing on very specific workflows where value can be clearly demonstrated and measured. 

The most promising teams today are those that combine engineering capability with deep subject matter expertise, whether in medicine, law, logistics, or manufacturing. This trend points toward a more fragmented but robust AI startup ecosystem — one where the winners are not generalists, but specialists who understand both the model and the market. 

 

Simultaneous Forces 

The landscape of AI is being reshaped by several simultaneous forces: 

  • The commoditization of model development 
  • Shifting business models and pricing strategies 
  • Hardware constraints and emerging alternatives 
  • A return to domain-driven innovation 

For founders, the message is clear: building competitive advantage today means moving fast, understanding your users deeply, and leveraging existing infrastructure intelligently. For investors, it means focusing less on technical novelty and more on execution, traction, and sustainable go-to-market strategies. 

In our recent episode of Let’s Talk About Tech, host Berthold Baurek-Karlic spoke with AI expert Clemens Wasner, founder of EnliteAI and chair of AI Austria, about all the core shifts shaping the current AI landscape. Today, competitive advantage is increasingly defined by speed, domain expertise, and the ability to ship and iterate quickly. As Wasner noted, “the actual competition no longer takes place on the model level, but on what you do with the model.”  

Listen to the full episode here: 
