The Natural Language Processing Group at the University of Edinburgh (EdinburghNLP) is a group of faculty, postdocs, and PhD students working on algorithms that make it possible for computers to understand and produce human language. We do research in all core areas of natural language processing, including morphology, parsing, semantics, discourse, language generation, and machine translation. EdinburghNLP also has a strong track record of work at the interface of NLP with other areas, including speech technology, machine learning, computer vision, cognitive modeling, social media, information retrieval, robotics, bioinformatics, and educational technology.

With 11 core faculty members, EdinburghNLP is one of the largest NLP groups in the world. It is also ranked as the most productive group in the area according to csrankings.org. Our achievements include the award-winning neural machine translation system Nematus and the high-performance language modeling toolkit KenLM. EdinburghNLP faculty have a strong record of winning high-profile grants, and have so far been awarded a total of five European Research Council (ERC) grants.

We are looking for new PhD students. Join us! Please also check out the new UKRI Centre for Doctoral Training in Natural Language Processing.

We are hiring new faculty! See the job advertisement here.

News

This week we are thrilled to be presenting 14 papers at @emnlpmeeting in the main conference sessions! If you are interested in our work or have any questions or comments, please reach out! #EMNLP2022

The School of Informatics at the University of Edinburgh @InfAtEd is recruiting lecturers/readers (assistant/associate professors) in three NLP-adjacent areas:
Speech Technology (deadline 12 Jan), Computational Cognitive Science (deadline 12 Jan), and Machine Learning (deadline 10 Jan). Job adverts:

Nonparametric Learning of Two-Layer ReLU Residual Units

Zhunxuan Wang, Linyun He, Chunchuan Lyu, Shay B Cohen

https://openreview.net/forum?id=YiOI0vqJ0n

#NewPaper #PaperPost

💥KG Embedding Models are GNNs! 💥 We found that their training dynamics can be described via message-passing operations, enabling us to define a new class of GNNs that achieves SOTA results on inductive and transductive link prediction tasks! #NeurIPS22 https://arxiv.org/abs/2207.09980
