We help you get started.
With containers and AI.
Our Approach
In many industries, it is important to protect intellectual property. Entrepreneurs do not want data, code or AI models to leave the premises – which rules out a standard cloud solution.
Nevertheless, it is possible to run AI models on-site if they are cleverly packaged. This allows you to reap the benefits of AI and take the first step towards digitising your processes.
Start today in your own data centre and decide tomorrow what you want to put in the open cloud and what should remain in the safe haven. Contact us for an initial consultation.
An LLM for chemical reasoning. On-premises.
In summer 2025, ether0 was released – a large language model for reasoning in chemical science. Too big to run locally? Not if you know how to do it.
We encapsulated it in a Docker image and deployed it successfully on a local appliance running Linux with two standard GPUs. The model worked fine on our appliance and answered our questions fluently.
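A deployment along these lines can be sketched with a container runtime and a GPU-enabled serving image. The compose file below is a minimal sketch under stated assumptions, not our exact configuration: the serving image (`vllm/vllm-openai`) and the model identifier (`futurehouse/ether0`) are plausible choices based on the public releases, not confirmed details of our appliance.

```yaml
# Minimal sketch: serving an LLM such as ether0 on a local Linux box
# with two GPUs via Docker Compose. Image and model ID are assumptions.
services:
  llm:
    image: vllm/vllm-openai:latest        # OpenAI-compatible serving image
    command: ["--model", "futurehouse/ether0",
              "--tensor-parallel-size", "2"]   # split the model across both GPUs
    ports:
      - "8000:8000"                       # local API endpoint, no cloud involved
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 2                    # the two standard GPUs
              capabilities: [gpu]
    volumes:
      - ./models:/root/.cache/huggingface # model weights stay on-site
```

With a setup like this, any application on the local network can talk to the model over a standard HTTP API, while the weights never leave the building.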
Considering a trial placement? Then please contact us.
A digital lab assistant with speech recognition that follows your every command. Without the cloud.
As part of a proof of concept, we had the opportunity to work directly in a chemical laboratory. The task was to enhance a glovebox with a voice-based digital assistant. All software components had to run on a device in the laboratory.
Using open-source components, we packaged the main building blocks – speech-to-text, text-to-speech and the chatbot – into containers on a simple Raspberry Pi running Linux. The user spoke to the assistant via a Bluetooth headset, leaving their hands free for work, while the digital assistant took control of the glovebox and carried out the commands it was given.
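The three components can be sketched as a small compose stack. This is a hedged illustration, not the exact setup from the proof of concept: the image names (`rhasspy/wyoming-whisper`, `rhasspy/wyoming-piper`, `ollama/ollama`) are common open-source options that run on ARM devices, used here as assumptions.

```yaml
# Minimal sketch of the three local components on a Raspberry Pi.
# Image names are plausible open-source choices, not necessarily
# the ones used in the proof of concept.
services:
  speech-to-text:
    image: rhasspy/wyoming-whisper        # local STT engine (assumption)
    ports: ["10300:10300"]
  text-to-speech:
    image: rhasspy/wyoming-piper          # local TTS engine (assumption)
    ports: ["10200:10200"]
  chatbot:
    image: ollama/ollama                  # local LLM runtime (assumption)
    ports: ["11434:11434"]
    volumes:
      - ./ollama:/root/.ollama            # model weights stay on the device
```

Each service exposes a local port, so the assistant logic can wire microphone input through speech-to-text, pass the transcript to the chatbot, and speak the answer back – all without a single packet leaving the laboratory.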
All this was achieved with 100% local components. No GDPR issues, no lengthy approval processes with the IT department, no hassle with the union.
Just getting the job done.