Our papers are the official record of our discoveries. They allow others to build on and apply our work. Each one is the result of many months of research, so we take special care to make our papers clear, inspiring and beautiful, and to publish them in leading journals.
Machine learning
The limits of LLMs
Large language models like ChatGPT can generate human-like text, but businesses that overestimate their abilities risk misusing the technology.
Machine learning
DeepPavlov dream
A new open-source platform is tailored to developing complex dialogue systems, such as generative conversational AI assistants.
Computational linguistics
Cross-lingual knowledge
Models trained on a Russian topical dataset of knowledge-grounded human-human conversation are capable of real-world tasks across languages.
Machine learning
Speaking DNA
A family of transformer-based DNA language models can interpret genomic sequences, opening new possibilities for complex biological research.
Machine learning
BERT enhanced with recurrence
The quadratic complexity of attention in transformers is tackled by combining token-based memory with segment-level recurrence in the Recurrent Memory Transformer (RMT).
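
To illustrate the mechanism described in this last entry, here is a minimal sketch of segment-level recurrence with memory tokens, in the spirit of the Recurrent Memory Transformer. It uses a generic PyTorch encoder; the class name, layer sizes and segment length are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch of segment-level recurrence with memory tokens (the RMT idea).
# All hyperparameters below are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class RecurrentMemorySketch(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, n_heads=4, n_layers=2, n_memory=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Learned memory tokens, prepended to every segment and updated recurrently.
        self.memory = nn.Parameter(torch.randn(n_memory, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, token_ids, segment_len=128):
        batch = token_ids.size(0)
        memory = self.memory.unsqueeze(0).expand(batch, -1, -1)
        outputs = []
        # Process the long input segment by segment: attention is quadratic only
        # within a segment, while the memory tokens carry context across segments.
        for start in range(0, token_ids.size(1), segment_len):
            segment = self.embed(token_ids[:, start:start + segment_len])
            hidden = self.encoder(torch.cat([memory, segment], dim=1))
            memory = hidden[:, :memory.size(1)]          # updated memory for the next segment
            outputs.append(hidden[:, memory.size(1):])   # token representations for this segment
        return torch.cat(outputs, dim=1), memory
```

In this sketch the cost of attention grows with the square of the segment length rather than the full sequence length, and the only channel between segments is the small set of memory tokens.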