In the rapidly evolving landscape of machine learning research, neural networks stand out for their ever-expanding parameter counts and reliance on increasingly large datasets. The financial cost and computational resources required for the training p ...
Buildings play a pivotal role in the ongoing worldwide energy transition, accounting for 30% of global energy consumption. With traditional engineering solutions reaching their limits on such large-scale problems, data-driven methods and Machine ...
In this dissertation, we propose multiple methods to improve transfer learning for pretrained language models (PLMs). Broadly, transfer learning is a powerful technique in natural language processing, where a language model is first pre-trained on a data-r ...
Natural language processing has experienced significant improvements with the development of Transformer-based models, which employ self-attention mechanisms and pre-training strategies. However, these models still face several obstacles. A notable issue ...
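The self-attention mechanism named above can be sketched in a few lines of numpy. This is a minimal single-head, unmasked sketch; the projection matrices, sequence length, and dimensions are illustrative assumptions, not any particular model's parameters.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a sequence x of shape (n, d)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (n, n) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ v                               # each position mixes all values

rng = np.random.default_rng(0)
n, d = 4, 8                                          # toy sequence length and width
x = rng.standard_normal((n, d))
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)  # (4, 8): one mixed representation per input position
```

Because every position attends to every other, the cost of the score matrix grows quadratically with sequence length, which is one of the obstacles such models face.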
Purpose: Despite being an authentic carrier of various cultural practices, the human body is often underutilised as a means of accessing the knowledge it carries. Digital inventions today have created new avenues to open up cultural data resources, yet mainly as appa ...
Various endeavours in semantic web technologies and ontology engineering have been made to organise cultural data, facilitating public access to digital assets. Although models for conceptualising objects have reached a certain level of ma ...
Recent transformer language models achieve outstanding results in many natural language processing (NLP) tasks. However, their enormous size often makes them impractical on memory-constrained devices, requiring practitioners to compress them to smaller net ...
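The compression this abstract alludes to can take several forms (distillation, quantization, pruning). One common pruning baseline is magnitude pruning, sketched below in numpy; the function name, sparsity level, and matrix size are illustrative assumptions rather than the abstract's specific method.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of entries."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)                 # number of entries to drop
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # keep only larger entries
    return weights * mask

rng = np.random.default_rng(1)
w = rng.standard_normal((64, 64))                 # stand-in for one weight matrix
pruned = magnitude_prune(w, 0.9)
achieved = (pruned == 0).mean()
print(round(achieved, 2))                         # close to the requested 0.9
```

The resulting sparse matrix has the same shape, so it can replace the original wherever memory-efficient sparse storage or kernels are available.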
Time has always been a central factor in understanding the challenges of daily mobility. For a long time, and still today, methods of economic evaluation of transport projects have monetized time savings so that they can be included in the cost–benefit ana ...
The recent developments of deep learning cover a wide variety of tasks such as image classification, text translation, playing Go, and folding proteins. All these successful methods depend on a gradient-based learning algorithm to train a model on massive ...
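Gradient-based learning, common to all these successes, can be illustrated on the simplest case: gradient descent on a least-squares objective. This is a minimal numpy sketch; the synthetic data, learning rate, and step count are illustrative assumptions.

```python
import numpy as np

# Gradient descent on least squares: minimise mean((X @ w - y)**2).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w                               # noiseless targets for clarity

w = np.zeros(3)                              # start from an uninformed model
lr = 0.01
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)    # gradient of the mean squared error
    w -= lr * grad                           # step against the gradient

print(np.round(w, 2))                        # recovers approximately [2., -1., 0.5]
```

Deep networks replace the linear model with millions of parameters and the exact gradient with stochastic minibatch estimates, but the update rule is the same descent step.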
Abstractive summarization has seen substantial improvements in recent years, driven largely by advances in neural language modeling, language-model pretraining, and the scaling of models and datasets. While large language models generate summaries that are fluent, coherent, ...