Our Open Source Contributions
Over the years, we have published parts of our code and our AI models.
We have summarised these contributions on this page.
Multilingual Passage Reranking Model
- This reranking model, trained by us, uses a language model to score how relevant a passage is to a given query; a usage sketch follows after this list.
- It has been rated as one of the most efficient models of its kind in several scientific papers.
- It now has a mid six-figure number of downloads.
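The sketch below shows one way such a cross-encoder reranker could be used with the Hugging Face transformers library. The model identifier, the query and the passages are assumptions for illustration, not details taken from this page.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Assumed model identifier -- replace with the actual published reranking model.
MODEL_NAME = "amberoad/bert-multilingual-passage-reranking-msmarco"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()

query = "How do I reset my password?"
passages = [
    "To reset your password, open the account settings and choose 'Reset password'.",
    "Our office is located in Aachen.",
    "Password resets are confirmed by e-mail within a few minutes.",
]

# Encode each (query, passage) pair and let the model score its relevance.
inputs = tokenizer([query] * len(passages), passages,
                   padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# A binary relevance head yields a "relevant" probability via softmax;
# a single-logit head can be used directly as the score.
scores = logits.softmax(dim=-1)[:, -1] if logits.shape[-1] > 1 else logits.squeeze(-1)

# Sort passages by descending relevance to the query.
for passage, score in sorted(zip(passages, scores.tolist()), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {passage}")
```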
High-Quality German Training Dataset
- High-quality data sets are required to train AI models and avoid certain biases.
- Together with Philip May of T-Systems, we created this training dataset to train our LLMs; a loading sketch follows after this list.
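The sketch below illustrates how such a published dataset could be loaded with the Hugging Face datasets library. The dataset identifier and the split name are placeholders, not the actual names of the published data.

```python
from datasets import load_dataset

# Hypothetical dataset identifier -- substitute the actual published name.
dataset = load_dataset("organisation/german-training-dataset")

# Inspect the available splits and a sample record before fine-tuning.
# (The "train" split is assumed here.)
print(dataset)
print(dataset["train"][0])
```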
German Electra Model
- Together with Philip May of T-Systems, we have published an improved Electra model, trained for 1.5 million steps; a usage sketch follows after this list.
- At the time of publication, it was briefly the best-performing German-language model.
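The following sketch shows how such an Electra encoder could be loaded for feature extraction with the Hugging Face transformers library. The model identifier is an assumption, and the example sentence is illustrative only.

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Assumed model identifier -- replace with the actual published Electra model.
MODEL_NAME = "german-nlp-group/electra-base-german-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

sentence = "Aachen ist eine Stadt in Nordrhein-Westfalen."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The last hidden state provides contextual token embeddings that can be
# passed to a downstream classifier or fine-tuned further.
print(outputs.last_hidden_state.shape)
```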
Read related posts on this topic
Why enterprise search is the most sensible foundation for any AI system in a company
Enterprise search is essential for integrating AI into company processes in the long term. This article explains the role it plays.
What is a multi-agent system?
Multi-agent systems are the logical evolution of classic AI agents. This blog article explains what these systems can do and what they are used for.
DB Regio AG – Revolutionising customer service together
On 6 September 2023, as part of the DB mindbox Accelerator, amberSearch was invited to Station Berlin to present a solution for automating customer dialogue with generative AI at the Zukunft Nahverkehr trade fair. amberSearch won this pitch. Here is a recording of the pitch:
Get started right away!
Take the first step today to start benefiting from amberSearch.