Categories
Application, Interactive HPC, Research, Supercomputing

Introducing RAGFlow: Enabling Smarter Research with AI-Powered Search

A new open-source application is now available on UCloud, designed for students, researchers, and educators working with complex data and artificial intelligence. RAGFlow – built on Retrieval-Augmented Generation (RAG) – combines powerful language models with your own academic materials, offering an intelligent way to search, explore, and interact with content.

Whether you’re conducting a literature review, developing a teaching assistant, or building a domain-specific chatbot, RAGFlow provides an intuitive pipeline that transforms unstructured documents into a searchable, AI-ready knowledge base. But RAGFlow is more than just question-answering. It supports the creation of custom workflows and intelligent agents, enabling advanced interactions, data processing, and tool integration – all within a flexible and transparent environment.

What can you do with RAGFlow?

RAGFlow helps large language models (LLMs) generate accurate answers based on real data – not just pre-trained knowledge. It’s built to close the gap between raw academic material and useful insight.

RAGFlow is designed with both beginners and advanced users in mind. At its simplest, you can just upload documents and start asking questions. The interface guides you through the basics, so you can get useful results straight away.

As your needs grow, you can delve deeper into advanced features such as custom chunking, retrieval tests, datasets, and programmable workflows. Comprehensive documentation and tutorials are available, allowing you to learn at your own pace and expand your use of the platform over time.

Key Features:
  • Data Ingestion & Chunking:
    Upload PDFs, text files, webpages and more. RAGFlow automatically breaks them into manageable parts.
  • Embedding & Indexing:
    These chunks are converted into vector representations so they can be searched by meaning, not just keywords.
  • Smart Retrieval:
    When you ask a question, the system finds the most relevant information.
  • Contextual Generation:
    An LLM uses this context to generate well-informed responses.
  • Cited Sources:
    All answers come with grounded citations, showing where the information came from — supporting transparency and academic rigour.

This process improves the quality of responses and significantly reduces the risk of hallucinated or misleading answers.
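To make the pipeline concrete, here is a short, self-contained sketch. It is purely illustrative – this is not RAGFlow's API, and a toy bag-of-words vector stands in for the learned embeddings a real system would use – but it shows the chunk, embed, and retrieve steps that ground an LLM's answer in your own documents.

```python
# Schematic sketch of the retrieve-then-generate loop (illustrative only).
from collections import Counter
import math

def chunk(text, size=40):
    """Split a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text):
    """Toy 'embedding': a bag-of-words count vector (real systems use
    learned dense vectors)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=1):
    """Return the k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

docs = ["The mitochondrion is the powerhouse of the cell.",
        "UCloud provides interactive HPC resources to Danish researchers."]
chunks = [c for d in docs for c in chunk(d)]
context = retrieve("What is the powerhouse of the cell?", chunks)
# The retrieved context is then passed to an LLM together with the
# question, and the answer is cited back to its source chunk.
print(context[0])
```

In RAGFlow the same loop runs with proper vector indexes and configurable chunking strategies, but the principle is identical: answers are generated from retrieved evidence rather than the model's memory alone.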

From Search to Workflow: Introducing Agents

Beyond document search, RAGFlow also allows you to build and customise your own AI-powered agents. These agents can search, analyse, and use tools on your behalf – forming a pipeline tailored to your specific research needs.

So, what is an agent?

Think of an agent as a specialised AI assistant. You might create one to retrieve data from a source, another to analyse it, and a third to generate a written summary or report. These agents can be chained together into a programmable pipeline – a step-by-step flow where each agent passes its output to the next.

For example, you could build a research assistant that:

  • Searches for academic papers on a topic
  • Extracts and summarises the most relevant findings
  • Runs basic statistical analysis
  • Outputs the results as a draft report
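Conceptually, such a pipeline is just a chain of steps, each consuming the previous step's output. The sketch below models each agent as a plain Python function; the names and behaviours are invented for illustration and do not reflect RAGFlow's actual agent interface.

```python
# Illustrative agent chain: each "agent" is a function, and the pipeline
# passes one agent's output to the next. Names are hypothetical.
def search_agent(topic):
    # Stand-in for a real literature search.
    return [f"paper about {topic} #1", f"paper about {topic} #2"]

def summarise_agent(papers):
    # Stand-in for an LLM-generated summary of each paper.
    return [p.upper() for p in papers]

def report_agent(summaries):
    # Assemble the summaries into a draft report.
    return "DRAFT REPORT:\n" + "\n".join(summaries)

def pipeline(topic, steps):
    data = topic
    for step in steps:
        data = step(data)
    return data

report = pipeline("coral reefs", [search_agent, summarise_agent, report_agent])
print(report)
```

In RAGFlow these steps are composed visually and can call retrieval, analysis tools, or LLMs, but the underlying idea is the same chained data flow.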

Unlike typical ‘black-box’ AI tools, which conceal their inner workings, RAGFlow provides full transparency, allowing you to understand exactly how your AI operates. You can inspect, adjust, and understand every stage – from document chunking to embedding, retrieval, and agent reasoning. It’s a flexible and reproducible platform where your agents can be saved, re-run, or even shared with colleagues.

Why use RAGFlow on UCloud?

RAGFlow is available directly on UCloud. This offers several key advantages:

  • Academic Use Cases:
    Build assistants for teaching, research discovery, or even entire knowledge bases for your institute or research centre.
  • No Installation Required:
    Launch RAGFlow on UCloud with everything preconfigured and ready to use.
  • Flexible AI Model Support:
Choose from models hosted on Hugging Face or Ollama, or take advantage of GPU-accelerated inference with vLLM – all accessible via an API key.
  • Easy Document Management:
    Upload and manage a wide range of formats, including PDFs, scanned documents, spreadsheets, and HTML.
Learn more 

Guides and technical details:
RAGFlow Guide
RAGFlow documentation on UCloud

A recorded tutorial will also be available shortly. Sign up for the newsletter to receive updates on this and other Interactive HPC news.

Categories
Interactive HPC, Research, Supercomputing, Teaching, UCloud, Use case

UCloud Provides Student Access to Advanced NLP in Teaching 

In the Master’s programme in Cognitive Science at Aarhus University, UCloud plays a central role in teaching Natural Language Processing (NLP). For instructor and PhD student Mina Almasi, the platform is essential in enabling students to work hands-on with complex models – regardless of the limitations of their own computers.

From Theory to Hands-On Learning 

In a white classroom in Nobelparken, Mina stands in front of 15 students. On the screen behind her, lines of Python code appear in neat, symmetrical rows as she explains which code libraries the students need to access.

In her teaching, she uses the Coder Python application in UCloud because the course is based on Python programming. But the choice of platform is not just about software – it is about giving students the opportunity to translate theory into practice.

According to Mina, NLP teaching previously tended to remain at a more theoretical level, due to limited access to both models and the computing power needed to test theories in practice – especially when it came to large language models. With UCloud, students can now work directly with large language models (LLMs) and make use of powerful GPUs and CPUs. This allows them to test theories themselves and experiment hands-on with the tools they are learning about.

“We still teach the theory, but now we can also have students use the tools in practice. They can code on their own and gain insight into how a large language model works by working directly with it through UCloud,” she explains.

A Standardised Setup that Democratises the Classroom 

Another advantage of using UCloud in NLP teaching is that the platform ensures equal access for all students, regardless of the computer they own.

“There is a kind of democratisation of the classroom, because you don’t need the latest computer. You can use a five-year-old machine to run very heavy tasks that the newest tools in Natural Language Processing require,” she explains.

At the same time, the standardised setup makes teaching more seamless. All students work with the same standard configuration in UCloud, so any issues that arise are the same for everyone. This creates a shared sense of problem-solving, as challenges can be addressed collectively rather than handled individually by students on their own. As Mina puts it:

“Instead of stopping the lesson to solve individual problems, the problems become collective and an opportunity for learning for everyone. If we have a software issue – for example, a Python library version that is outdated or incompatible – it affects everyone, and we can solve it together.”

Preparing Students for Working Life

For Mina, using UCloud also helps prepare students for the reality that awaits them after graduation. According to her, many of the students who go on to IT positions will likely use cloud computing platforms rather than coding on local machines. In this way, the teaching becomes direct preparation for future job tasks and gives students experience with the technologies they will encounter in practice.

Advice for Other Instructors 

Mina has used UCloud since her bachelor’s degree and finds that the platform makes teaching both smoother and more engaging.

“I recommend that other instructors make use of the platform. You just have to get started – but feel free to ask colleagues for advice on how they use it. Get some inspiration, because UCloud is a fantastic tool. It can do a great many things, but like other systems, it can feel a bit overwhelming at first, so it’s a good idea to get some guidance along the way before you begin.”

Categories
Call, Interactive HPC, Research, Supercomputing, UCloud

H2-2026 National HPC Call is open

You can now apply for compute time on UCloud. DeiC has opened the first 2026 call for applications for access to Denmark’s national HPC facilities – and Interactive HPC – UCloud is part of this call.

So if your research needs extra compute resources on UCloud, now is the time to apply. These calls open only twice a year, so this round is a great opportunity. Researchers and PhD students at Danish universities can apply.

Key dates

  • Call opens: 13 January 2026
  • Application deadline: 10 March 2026
  • Resources available from: 1 July 2026

Read more and apply via DeiC

Categories
Application, Interactive HPC, Research, Supercomputing, Tutorial, Workshop

Workshop 26/2: CVAT – AI-Assisted Labeling

Date: February 26, 2026

Time: 13:15 to 14:30 CET

Location: Online via Zoom

CVAT (Computer Vision Annotation Tool) is an interactive tool designed to facilitate the annotation of video and image data and accelerate the creation of high-quality datasets for computer vision tasks. CVAT is available in the Application Store on the UCloud platform.

The webinar will show how to use CVAT on UCloud to:

Label and annotate data with the help of AI and OpenCV tools, including:

  • Using cvat-cli
  • Running built-in models for detection and auto-annotation
  • Using GPUs with built-in models for faster annotation
  • Adding custom models (e.g. YOLO)

Efficiently manage large visual datasets with MinIO:

  • Allow CVAT to directly pull images from your UCloud MinIO buckets for annotation and export annotated data back, reducing manual imports/exports and ensuring data availability.

Using UCloud allows users to create fully reproducible and secure workflows that leverage high-performance computing resources. These features are often necessary for large datasets and accurate computer vision tasks.

Target audience: Researchers across all departments, particularly those who require high-precision data labeling or have an interest in AI.

Technical Level: Basic to Intermediate

Sign up for the CVAT workshop

Categories
Interactive HPC, Research, Supercomputing

UCloud and Digital Sovereignty in focus during Ministerial Visit

On 27 October, Minister for Digital Affairs Caroline Stage Olsen visited the University of Southern Denmark (SDU) to learn more about UCloud and the Interactive HPC Consortium. 

The visit aimed to showcase how Danish research contributes to strengthening Denmark’s digital independence and sovereignty. The Department of Mathematics and Computer Science (IMADA) and the SDU eScience Center at the Faculty of Science were pleased to welcome the Minister to SDU.

During her visit, the Minister was introduced to UCloud, an open-source cloud platform operated by the Interactive HPC Consortium. Originally developed by SDU, UCloud has been available since 2019 via the DeiC Interactive HPC service to all researchers in Denmark. Today, the consortium behind UCloud comprises SDU, AU, and AAU, who jointly develop and operate the platform.

The Minister emphasised that digital sovereignty and the development of cloud solutions under Danish control are key priorities for the government:

“This is something we are increasingly discussing – how we can become more independent and strengthen our control over digital infrastructure. That is part of what I am learning about today,” said Caroline Stage Olsen, Minister for Digital Affairs, during her visit.

Building Bridges Between Research and Society

UCloud serves as Denmark’s national platform for interactive high-performance computing (HPC) and is Europe’s most widely used research supercomputing platform. With more than 18,000 users across universities, public authorities, and private companies, it stands as a clear example of how Danish-developed solutions can promote digital self-reliance.

“True digital sovereignty requires public infrastructure you can inspect, control, and improve. UCloud turns sovereignty from a slogan into a living, open-source public good — Europe’s largest research cloud built in Denmark. Investing in open infrastructure like UCloud is how we can secure our digital future,” said Professor Claudio Pica, Head of the SDU eScience Center.

A Responsibility Towards Society

The visit also prompted a broader dialogue about the responsibility of research institutions in an era where digitalisation permeates every aspect of society – from healthcare and education to the energy sector and public services.

The visit concluded with a tour of SDU’s supercomputing facilities, where the Minister was introduced to the advanced infrastructure that supports Interactive HPC – UCloud.

This article is based on an original story published on SDU’s website.

Categories
Interactive HPC, Research, UCloud, Use case

DeiC Interactive HPC Crucial for Danish AI Language Models

By Jasper Riis-Hansen and Line Ejby Sørensen, Center for Humanities Computing (CHC), Aarhus University

DeiC Interactive HPC – UCloud plays a central role in the Danish Foundation Models (DFM) project, which forms part of the Danish government’s strategic initiative for artificial intelligence.

Danish Foundation Models (DFM) is supported by the Ministry of Digital Affairs as part of the national AI strategy, which aims to ensure that Denmark has access to advanced and tailored language models. These models are intended for use across a wide range of sectors, including healthcare, public administration, education, and private enterprise.

A shared digital environment

The DFM project brings together Danish universities, research institutions, and industry partners in a joint effort to establish new standards for ethically responsible and inclusive AI language technologies.

The project is a collaboration between Aarhus University, the University of Copenhagen, the University of Southern Denmark, and the Alexandra Institute. DeiC Interactive HPC – UCloud plays a vital role in this work by providing high data security, scalable computing power, and, not least, an accessible, secure, national cloud platform that enables collaboration among project partners.

“UCloud forms the foundation for an important step in research digitalisation, as the platform provides easy access to computing power, enabling scalable data analysis and modelling, while also offering a secure environment for handling sensitive data. The platform also facilitates collaboration across institutions and allows us to manage data access as needed. This is particularly relevant in the DFM project, which includes many partners participating at different levels.”
Postdoc Kenneth Enevoldsen

Data security and computing power

Because AI models are often trained on sensitive data, it is crucial that data processing complies with both GDPR and Danish security standards. UCloud is ISO27001-certified and specifically designed to meet both Danish and EU requirements for secure data handling.

“In the DFM project, we work with very large amounts of data from a variety of sources – including sensitive data that the models are trained on – and this places high demands on data security. That is why UCloud is such a valuable tool for the project – precisely because of its high level of data security and access to scalable computing power.”
Postdoc Kenneth Enevoldsen

Although DFM also makes use of European supercomputers such as LUMI in Finland and Leonardo in Italy, the day-to-day operations of the project are heavily reliant on UCloud. In addition to being a springboard for high-performance computing, UCloud also provides a secure and user-friendly platform with a wide range of accessible applications – all essential for daily research, collaboration, data processing, and innovation across the project’s interdisciplinary team.

Critical infrastructure for Danish AI development

DFM’s principal investigators, Kristoffer Nielbo and Peter Schneider-Kamp, emphasise that the robust digital research environment provided by DeiC Interactive HPC – UCloud constitutes critical infrastructure. It streamlines workflows, enhances collaboration, and accelerates the development of both language and AI technologies.

“Without UCloud, the DFM project would have had to develop this type of digital infrastructure itself – with significant time and financial costs. The platform’s role in the project clearly demonstrates how robust, collaborative digital research environments are essential to Denmark’s AI strategies.”

Danish Foundation Models (DFM) is a collaborative project involving Aarhus University, the University of Copenhagen, the University of Southern Denmark, and the Alexandra Institute.

The project is supported by the Ministry of Digital Affairs with a grant of DKK 30.7 million and aims to develop advanced language models with open access and transparent development processes.

The models are specifically tailored to Danish and other Scandinavian languages and cultures and are intended for use across sectors such as healthcare, public administration, education, and business.

DFM seeks to establish a new standard for ethically responsible, inclusive, and transparent AI language technology – for the benefit of both Danish society and the research community.

For more information, visit: Danish Foundation Models, Ministry of Digital Affairs press release

Categories
Interactive HPC, Research, Supercomputing, UCloud

DeiC Interactive HPC Revolutionises Interdisciplinary Research with User-Friendly Supercomputing Access

With 10,000 users, DeiC Interactive HPC has established itself as one of Europe’s most popular HPC facilities, thanks to an unprecedented democratisation of access to advanced computing resources. These resources, once reserved for specialised research fields and technically adept specialists, are now accessible to any researcher with a dataset and a vision.

Through a newly developed, simple, and graphical user interface, DeiC Interactive HPC, also known as UCloud, makes it easier than ever to gain interactive access to supercomputing. This approach reduces technical barriers and enhances research collaboration by offering shared, easily accessible virtual environments. As a result, DeiC Interactive HPC supports dynamic and interdisciplinary research, accelerating research processes and promoting innovation in fields ranging from bioinformatics to digital humanities.

Democratising Access to HPC

The trend towards more interactive use of technology, including HPC, reflects efforts to make the STEM field more inclusive and accessible, mirroring broader societal changes towards diversity and inclusion in technology and science. DeiC Interactive HPC’s user-friendly approach has attracted a broad spectrum of users, including those from nearly all Danish universities and individuals with varying levels of technical expertise, notably many students.

“We are proud to highlight the growing diversity among DeiC Interactive HPC users, a development that further distinguishes DeiC Interactive HPC from traditional HPC systems. We see continuous growth in user numbers and are now celebrating surpassing 10,000 users across a very broad spectrum of research disciplines, which is impressive in the HPC field. Of these users, 50% are students, reflecting DeiC Interactive HPC’s success in attracting new users and serving as a bridge to larger European HPC facilities,” says Professor Kristoffer Nielbo, representing Aarhus University in the DeiC Interactive HPC Consortium.

By simplifying access to supercomputers, DeiC Interactive HPC democratises powerful data processing resources, enabling a wider range of researchers and academics to conduct innovative research without the steep learning curve traditionally associated with HPC. This inclusivity fosters scientific collaboration and creativity, enriching the HPC community with a diversity of perspectives and ideas.

“We continuously work to improve DeiC Interactive HPC with a democratic approach, using user feedback to ensure our focus is in the right place. This is also reflected in our new update – UCloud version 2 – which aims to increase efficiency and improve the user experience for researchers. It is part of our DNA as an interactive HPC facility to always keep the user in mind and develop apps and user interfaces based on user needs. Therefore, we encourage our users to reach out to us with their wishes and ideas,” says Professor Claudio Pica, representing the University of Southern Denmark in the DeiC Interactive HPC Consortium.

An All-Danish and Highly Secure System

Despite its international-sounding name, UCloud, DeiC Interactive HPC is part of the Danish HPC landscape, funded by Danish universities and the Ministry of Education and Research. The increased focus on developing a new generation of highly user-friendly applications means that researchers and other university staff can now use intuitive applications for transcribing sensitive data via DeiC Interactive HPC.

“DeiC Interactive HPC has already developed applications based on the same transcription technology found online and made them available in a secure environment through the UCloud platform. These transcription applications are just the beginning of a series of targeted secure applications that do not require prior experience, and we are always open to user input and ideas that arise from their unique needs but often prove beneficial to many,” says Lars Sørensen, Head of Digitalisation, representing Aalborg University and CLAAUDIA in the DeiC Interactive HPC Consortium.

By making advanced data processing more accessible to researchers from various disciplines, DeiC Interactive HPC helps break down the technical barriers that previously limited access to these resources. With an increasing number of students and new users from diverse backgrounds combined with continuous engagement in user-centred innovation, DeiC Interactive HPC not only supports the academic community but also plays a crucial role in promoting a more inclusive and productive research environment.


For further information and high resolution graphics, contact:
Kristoffer Nielbo, Director of Center for Humanities Computing, Aarhus University, 26832608 kln@cas.au.dk

UCloud offers access to advanced tools such as quantum simulation apps and H100 GPUs as well as applications aimed at data analysis and visualisation.

In data analysis, Python and Jupyter notebooks are particularly prominent, catering to the interactive, ad hoc, and data-centric workflows common in the field. These tools are highly valued for their user-friendliness in handling rapidly changing software environments and offer rich user interfaces, a significant advantage compared to traditional HPC setups, which can be more complex or less flexible.

Furthermore, the integration of tools such as Conda for managing software packages, Jupyter notebooks, RStudio, Coder, and Dask for parallel computing significantly enhances the usability of HPC resources for interactive and on-demand data processing needs. These tools help bridge the gap between the hardware of complex HPC systems and the user-friendly software environments that data scientists require.
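For readers curious what this chunked, parallel style looks like in practice, the sketch below uses only the Python standard library to mimic the pattern Dask automates – splitting a computation into chunks and combining partial results. It is illustrative only: Dask adds lazy task graphs, larger-than-memory arrays, and multi-node scheduling on top of this idea, and the thread pool here is just a stand-in.

```python
# Chunk-and-combine parallelism with the standard library (a stand-in
# for what Dask does at cluster scale).
from concurrent.futures import ThreadPoolExecutor

def chunk_mean(chunk):
    return sum(chunk) / len(chunk)

data = list(range(1_000_000))
chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]

with ThreadPoolExecutor() as pool:
    partial_means = list(pool.map(chunk_mean, chunks))

# The chunks are equal-sized, so the mean of chunk means equals the
# overall mean.
overall = sum(partial_means) / len(partial_means)
print(overall)
```

On an HPC system, Dask would distribute such chunks across many cores or nodes, which is precisely what makes these interactive tools valuable on platforms like UCloud.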

About DeiC Interactive HPC

Use Cases and News

News About the New UI

DeiC Interactive HPC (UCloud) is a successful collaboration between three universities: SDU, AU, and AAU.

Aalborg University, CLAAUDIA, represented by Lars Sørensen

SDU, eScience Center, represented by Professor Claudio Pica

Aarhus University, Center for Humanities Computing, represented by Professor Kristoffer Nielbo

Categories
Interactive HPC, Research, Supercomputing, UCloud

Video use case: HPC enlightens researchers in social sciences and humanities about human behavior

Sociologist Rolf Lyneborg Lund has used DeiC Interactive HPC to train an image AI that can help us understand how people perceive the concepts of “good” and “bad” neighbourhoods.

Visit deic.dk to view video use case from the 2023 DeiC Conference

Categories
Interactive HPC, Research, Supercomputing

State-of-the-art GPUs for AI available through DeiC Interactive HPC

AI companies around the world are scrambling to get their hands on NVIDIA’s latest and most powerful GPU, the H100. NVIDIA’s biggest customers include OpenAI, Microsoft, and Google. Now, 16 NVIDIA H100 GPUs have landed at SDU, ready to be integrated into the DeiC Interactive HPC system. With the arrival of 4 servers with 4 H100 GPUs each at SDU, Danish researchers will be able to access the same hardware coveted by some of the biggest tech companies in the world.

Go to story

Image: NVIDIA Hopper H100 GPU. Credit: NVIDIA

Categories
Interactive HPC, Research, Supercomputing, Uncategorized, Use case

Utilizing agent-based models in archaeological data   

Supercomputing has long been associated with areas such as physics, engineering, and data science. However, researchers in the humanities at Aarhus University are increasingly turning to supercomputing, which allows them to delve into unexplored territories and discover new insights.
From analysing historical archives to simulating ancient civilizations to analysing social media data, supercomputing offers unique opportunities to generate insights and advance knowledge in the humanities.

In this article series, we highlight three cases with humanities researchers from Aarhus University that illustrate the varied ways in which supercomputing is being used in humanities research.


Iza Romanowska is an assistant professor at Aarhus University, working at the Aarhus Institute of Advanced Studies, where she studies complex ancient societies.

To overcome the challenges of limited data from these ancient societies, researchers have started utilizing agent-based models (ABMs), sometimes enabled by supercomputing. ABMs are computational models that simulate the behaviour and interactions of individual entities, known as agents, within a specified environment or system. Each agent in the model is typically programmed with a set of rules or algorithms that control its behaviour, decision-making processes, and interactions with other agents and the environment.

ABM is a valuable tool in archaeology that allows us to simulate and analyse the behaviours and interactions of individuals or groups in past societies, and the use of ABM allows comparison of the model against real archaeological data.

Assistant Professor Iza Romanowska

In one of Iza Romanowska’s studies, agent-based modelling (ABM) made it possible for her and her colleagues to explore the Roman economy in the context of long-distance trade, using ceramic tableware to understand the distribution patterns and buying strategies of traders in the Eastern Mediterranean between 200 BC and AD 300.  

The potential of supercomputing in humanities becomes particularly evident when studying such societies with only limited data as experienced by archaeologists and historians. Iza Romanowska explains that the availability of data is limited in her field compared to other disciplines, stating that while social scientists studying more contemporary populations have access to abundant amounts of data such as the number of traders, transactions, and values, “we have none of this information.” Therefore, the use of HPC has been essential for her research.  

ABM as methodological tool necessitates running the simulation many times, and by many, I mean eight hundred thousand times, and that is possible with a laptop… if one plans to be doing their Ph.D. for 500 years. Supercomputing is bigger, faster, better without any qualitative change in terms of the research.

Assistant Professor Iza Romanowska

Using a high-performance computer like the DeiC Interactive HPC system enhances the scalability and speed of ABMs, allowing researchers to gain deeper insights into the behavior and outcomes of complex systems. The DeiC Interactive HPC facility hosts out-of-the-box tools, like NetLogo, for working with ABM. Researchers can also use ABM frameworks for Python or R in one of the many development apps like JupyterLab or Coder.  
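As an illustration of the rule-driven agents described above, the toy model below implements a classic wealth-exchange ABM in plain Python. It is a generic teaching example, not Romanowska's trade model; frameworks such as NetLogo or Python's Mesa add scheduling, visualisation, and data collection around the same core loop, and running such loops hundreds of thousands of times is where HPC becomes essential.

```python
# Minimal agent-based model: agents follow a simple rule, and patterns
# (here, wealth inequality) emerge from their repeated interactions.
import random

class Trader:
    def __init__(self):
        self.wealth = 1

    def step(self, others):
        # Rule: if this agent has wealth, give one unit to a random peer.
        if self.wealth > 0:
            other = random.choice(others)
            other.wealth += 1
            self.wealth -= 1

random.seed(42)                     # fixed seed for reproducibility
agents = [Trader() for _ in range(50)]
for _ in range(100):                # 100 simulated time steps
    for a in agents:
        a.step(agents)

# Inspect the resulting wealth distribution; total wealth is conserved.
print(sorted(a.wealth for a in agents))
```

A real study would sweep parameters and rerun such a simulation hundreds of thousands of times and compare the emergent distributions against archaeological data, which is exactly the workload interactive HPC makes tractable.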

Supercomputing and coding as research tools advance humanities research 

While humanities data in general is plentiful and can be analysed effectively, Iza Romanowska finds that there is a gap in understanding the underlying processes that generate the observed patterns, resulting in underdeveloped explanatory frameworks. Her point is that the lack of formal tools for theory building and testing remains a major disciplinary issue. 

“Within humanities including archaeology and history, data analysis is well-established. However, there’s a kind of fundamental disciplinary problem in that we don’t have or use many computational tools for theory building and theory testing. Supercomputing as a tool for the humanities can contribute to filling this gap and strengthen theory building, and ultimately it can advance the field of humanities research.”

Assistant Professor Iza Romanowska

Iza Romanowska believes that more people in humanities should learn to code to take advantage of the possibilities offered by their data. She suggests that supercomputing can be a natural progression from this. While many humanities researchers may not feel like they need supercomputing, perhaps they are simply not asking questions that could benefit from high-performance computing (HPC). 

I would especially encourage junior researchers in the humanities to embrace supercomputing. It never hurts to acquire a skill, and many of these tools are becoming so easily available that it’s almost a shame to not use them.


You have just read the second of three cases in our series on Interactive HPC usage in humanities.
Through these compelling cases it becomes evident that supercomputing in humanities research is transforming traditional approaches, empowering researchers to uncover new insights and deepen our understanding of the field.  It opens doors to interdisciplinary collaborations and expands the possibilities for data analysis and modelling, ultimately shaping the future of digital humanities. 

Stay tuned for our third case, featuring Rebekah Baglini and her field of linguistics, and check out the first case, featuring Katrine Frøkjær Baunvig and the creation of a Grundtvig artificial intelligence using HPC.