DeiC HPC Interactive HPC Supercomputing UCloud

Launch of the DeiC Integration Portal

Since the DeiC HPC services started in November 2020, a consortium of universities (AU, DTU and SDU) has been working hard to finish the ambitious development of the DeiC Integration Portal. The vision of the DeiC Integration Portal is to provide a national solution for accessing all the DeiC HPC systems and future DeiC services under one common portal. After two years of development, UCloud has now been expanded with new functionality to integrate with the DeiC HPC providers.

Denmark currently has three national HPC services operated and hosted by different consortia of Danish universities and coordinated by the Danish e-Infrastructure Cooperation (DeiC). All researchers in Denmark can apply for resources on the national HPC services, including the Danish part of the European supercomputer LUMI, either through their universities’ Front Office or via national calls. 

Along with the establishment of the national HPC services, it was also envisioned that researchers should be able to access the DeiC systems via a common national portal. This portal should ideally make it “as easy to use the national HPC centers as AWS, Azure and Google cloud services” (from the DeiC call in 2020). The DeiC Board decided to make a call for expressions of interest for the development of the DeiC Integration Portal, which at the time was also referred to as Project 5.

In 2020, the consortium of universities consisting of AU, DTU and SDU, with SDU as the coordinating body for the consortium, sent the proposal to base this portal on UCloud. This proposal was accepted by the DeiC Board in 2020.

When we answered the DeiC call in 2020, we understood the potential behind the vision of the DeiC Board. At the time the UCloud software platform was maturing into a full-fledged solution for e-research, and it seemed an ideal starting point for the DeiC Project 5 (DeiC Integration Portal).

Claudio Pica, professor at SDU and coordinator of the winning consortium.

Advantages for the users

For the users, there are many advantages of having a common portal to access the national HPC services. Professor, Kristoffer Nielbo, from the Center for Humanities Computing at Aarhus University, explains:

“As a researcher (and an infrastructure provider), a common portal brings us closer to the seamless integration of multiple national HPC systems. Such access simplifies my workflows and saves valuable resources otherwise spent on mentally, and sometimes physically, ‘switching’ between platforms. It also makes transitioning from interactive to batch jobs less ‘scary.’ Finally, the portal reduces resources spent on onboarding new researchers in my lab because they only have to learn how to access HPC through the Integration Portal.”

Professor Kristoffer Nielbo, Center for Humanities Computing, Aarhus University

Kristoffer Nielbo tested the portal during the project’s pilot phase in Fall 2022, and he was very happy with the result.

I was surprised at how well the portal reproduced the familiar user experience of DeiC Interactive HPC – where UCloud has been used for several years. Even though the mode of running jobs is fundamentally different (although DeiC Interactive HPC can run batch jobs), the project and file management, which are large parts of UCloud, were very similar. I wish more national HPC systems had been available during testing.

Professor Kristoffer Nielbo

A common portal also makes it easier for the DeiC Interactive HPC users to use and transition to other more “traditional” HPC systems, such as the LUMI supercomputer.

Even in my lab, I can see that more researchers that used to use DeiC Interactive HPC are now planning to use DeiC Throughput HPC. Project 5 arrived at the right time for many DeiC Interactive HPC users – we have just started to ‘develop an appetite’ for HPC. That being said, I see the different national HPC systems as complementary, and Project 5 enables more users to benefit from more systems.

Professor Kristoffer Nielbo

Implementation of the design

To better understand how the DeiC Integration Portal has been implemented in UCloud, it may be useful to look at how UCloud used to work. In the figure below, an end-user wants to run an application. Using their laptop, they open UCloud, find the application in the application store and click on the “Start” button. This causes their laptop to send a message to UCloud, containing the user’s command. UCloud then sends a similar message to the “DeiC Interactive HPC” computing resources (in this example the YouGene cluster at SDU).

How UCloud used to work before implementation of the DeiC Integration Portal

In Project 5, the consortium developed a component called the UCloud Integration Module (UCloud/IM), which is installed at and controlled by the service provider. The UCloud/IM communicates with UCloud and exposes the computing resources of the provider. Service providers retain full control over what the UCloud/IM can do.

How the system works after implementation of the DeiC Integration Portal and the Integration Module component.

At a technical level, UCloud/IM is plugin-based software. This means that, as a provider, you can choose and adapt the IM to fit your environment. We have packed it full of features for controlling authentication and authorization. It has several different implementations for compute, storage, licenses and more.

Dan Sebastian Thrane, team leader for cloud services at the SDU eScience Center
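The plugin-based design described above might be sketched roughly as follows. This is a minimal illustration, not the actual UCloud/IM code: the names (`compute_plugin`, `handle_request`, the `"slurm"` backend key) are hypothetical, and a real plugin would talk to a scheduler rather than return a string.

```python
# Minimal sketch of a plugin-based integration module (illustrative names,
# not the actual UCloud/IM API).

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class JobRequest:
    user: str
    application: str

# Registry mapping a backend type (e.g. "slurm") to a handler function.
_COMPUTE_PLUGINS: Dict[str, Callable[[JobRequest], str]] = {}

def compute_plugin(backend: str):
    """Decorator that registers a compute plugin for a backend type."""
    def wrap(fn: Callable[[JobRequest], str]):
        _COMPUTE_PLUGINS[backend] = fn
        return fn
    return wrap

@compute_plugin("slurm")
def submit_slurm(req: JobRequest) -> str:
    # A real plugin would submit to the scheduler as the mapped local user.
    return f"slurm-job:{req.application}:{req.user}"

def handle_request(backend: str, req: JobRequest) -> str:
    # The provider decides which plugins are enabled in its configuration;
    # anything not configured is rejected.
    if backend not in _COMPUTE_PLUGINS:
        raise ValueError(f"no plugin configured for backend {backend!r}")
    return _COMPUTE_PLUGINS[backend](req)
```

The point of the design is that each provider only enables, and can adapt, the plugins that fit its own environment.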

The UCloud/IM was designed to maintain a high level of IT security and the integrity of the individual service providers.

To use an analogy, without the UCloud/IM, sending a message via the DeiC Integration Portal would (from the service providers’ perspective) be like giving the postman the keys to your house to deliver the mail. Instead, the UCloud/IM acts like a “mailbox”, where the postman can leave your mail without entering your house.

Design Principles

It has been important for the consortium behind the DeiC Integration Portal to have a transparent design and an inclusive development process. A DeiC Steering Group, which included representatives from all the universities in Denmark, was formed by the DeiC Board. This steering group has discussed the design of the portal throughout the development period and approved the final result.

It has also been important for the consortium and DeiC to stress that the DeiC Integration Portal does not replace or control any functionality which DeiC service providers have. It simply exposes these functionalities in a secure and user-friendly way to all users with a common interface, acting as a secure message brokering system.

The DeiC Integration Portal initiative aims to facilitate access to remote compute resources through a joint portal with multiple backend HPC resources. These backend service providers are at the same time HPC service providers to their home universities and part of the emerging national HPC infrastructure. This mission duality implies that the resource providers, at all times, should be able to maintain full integrity and local control.

Michael Rasmussen, section leader for Research-IT (RIT) at Technical University of Denmark.

Full integrity and local control have been achieved by following a set of design principles:

  1. Zero-trust design
  2. Exclusively local users at the service providers
  3. Configurable integration module with no elevated privileges
  4. Local validation and authorization control for all actions, following local policies

‘Never trust, always verify’ (zero trust) has been a guiding principle for the design of the process from initiating a job, submitting the job request to the service provider, queuing and executing the job, and finally reporting back to the portal. Users authenticate with home-institution credentials (via WAYF) on login to the Integration Portal and can from here apply for compute resources. Once the DeiC Front Office of a user’s home-institution approves an application for resources, the local resource provider can authorize access by having a local user account created and associated with the user’s DeiC Integration Portal account.

Michael Rasmussen

If the user does not comply with the code of conduct, the compute resource provider can disable the user’s connection via the integration module and lock the local user account to prevent re-logins until further notice. This means that only user accounts that are validated, created and authenticated locally act on the local resource provider facility, thereby ensuring local integrity and control.

If a DeiC Integration Portal user unknown to the local resource provider facility submits a job, the process of validating and creating the new user account is completely controlled by the resource provider. This ensures that only locally validated users act on the local facility.

Michael Rasmussen
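The validation step Michael Rasmussen describes can be sketched as a simple lookup: a portal identity only maps to a job if the provider has already created, and not locked, a corresponding local account. The data model below (`account_map`, `authorize`, the example identities) is hypothetical and only illustrates the principle.

```python
# Illustrative sketch of local validation and authorization control
# (hypothetical data model, not the actual UCloud/IM implementation).

from dataclasses import dataclass

@dataclass
class LocalAccount:
    username: str
    enabled: bool

# Entries are only added after the Front Office approves the application
# and the provider creates the local account ("exclusively local users").
account_map: dict = {
    "alice@au.dk": LocalAccount("au_alice", enabled=True),
    "bob@dtu.dk": LocalAccount("dtu_bob", enabled=False),  # locked by provider
}

def authorize(portal_user: str) -> str:
    """Never trust, always verify: every request is re-checked locally."""
    account = account_map.get(portal_user)
    if account is None:
        raise PermissionError("unknown user: local validation required first")
    if not account.enabled:
        raise PermissionError("account locked by the resource provider")
    # The job runs as this local account, never as the portal identity.
    return account.username
```

Because the mapping lives at the provider, disabling a user is a purely local operation: the provider flips `enabled` off and the portal can no longer act on its behalf.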
DeiC Integration Portal provides access to the four HPC types that make up the national HPC landscape

Integration with DeiC Large Memory HPC

On the 19th of December 2022, the DeiC Large Memory HPC system became the first of several planned service providers to be enabled on the DeiC Integration Portal.

DeiC Large Memory HPC is a traditional HPC system with large memory nodes (up to 4 TB per node) based on Slurm as the workload manager. This kind of system has historically been used primarily by the natural sciences, such as physics and chemistry, for large-scale simulations of physical and biological systems via non-interactive batch jobs. As such, it is very different from the DeiC Interactive HPC platform.

Traditional HPC users from the natural sciences will also benefit from the new integration.

The DeiC Integration Portal provides project management features previously lacking on the system. From the platform, the project PI (or a project administrator) can now manage the project’s users themselves. Previously they had to write to user support whenever a user had to be added to the project. Similarly, users are now able to upload their SSH keys directly, instead of sending them by email.

Martin Lundquist Hansen, team leader for the infrastructure team at the SDU eScience Center

Martin Lundquist Hansen furthermore explains that:

The integration also allows users to manage their files and Slurm jobs directly from the UCloud platform. This is especially important for users less familiar with traditional text based HPC systems, but even for more experienced users this might be convenient in some cases. It is important to emphasize, however, that the DeiC Integration Portal simply provides an additional method for accessing the system, while traditional SSH access is still possible and unchanged.

Martin Lundquist Hansen

Like Kristoffer Nielbo, Martin Lundquist Hansen stresses that the DeiC Integration Portal may help users of the DeiC Interactive HPC system transition to other DeiC HPC systems:

With the new integration, users can consume resources on the DeiC Large Memory HPC system in the same way they are already consuming resources on UCloud. There is of course a difference in the type of applications that conventionally are used on the two types of systems, but they can now be accessed and executed in a uniform way. As users learn to run jobs on the system via the UCloud platform, the transition to accessing the system via SSH might also become easier, due to familiarity with certain aspects of the system.

Martin Lundquist Hansen

The implementation of the DeiC Integration Portal also offers a new avenue for running interactive jobs on traditional HPC clusters like the DeiC Large Memory HPC system, something that is not typically done on these types of systems. For example, a popular application is JupyterLab, a web-based environment for working interactively with languages such as Python and R. Thanks to the integration, such applications can be launched as Slurm jobs, and users can then work with them directly from their browsers.

We are planning to implement more applications of this type in the future, such that the resources are more readily available for non-expert users.

Martin Lundquist Hansen

Currently JupyterLab and RStudio are available for the DeiC Large Memory HPC.
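The mechanism can be illustrated with an ordinary Slurm batch script that starts a web application on a compute node. This is a hypothetical sketch: the directives, time limit, and port are illustrative, not the actual scripts used on DeiC Large Memory HPC, and the traffic forwarding from the user’s browser to the node is handled by the portal, not the script.

```shell
#!/bin/bash
# Illustrative Slurm job wrapping JupyterLab (values are examples only).
#SBATCH --job-name=jupyterlab
#SBATCH --nodes=1
#SBATCH --cpus-per-task=4
#SBATCH --time=08:00:00

# Start JupyterLab on the compute node; an integration layer would then
# forward the user's browser traffic to this port.
jupyter lab --no-browser --ip=0.0.0.0 --port=8888
```

From the scheduler’s point of view this is just another batch job; the interactivity comes from keeping the job alive while the user works in the browser.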

Integration with DTU Sophia

The DTU Sophia HPC cluster, which is part of the DeiC Throughput HPC service, is also available on the DeiC Integration Portal.

The Sophia system is hosted at DTU Campus Risø. The HPC cluster consists of dual-processor AMD EPYC nodes, fully connected through a 100 Gb/s InfiniBand fat-tree topology. The full description can be found in the system documentation.

Currently, the main user groups on Sophia are from DTU Wind and DTU Construct. They typically run heavy-duty numerical simulations such as Computational Fluid Dynamics workloads, using software like Ellipsys, OpenFOAM, PETSc, and WRF. Other commonly used applications are AI/machine learning, quantum chemistry (Density Functional Theory), Monte Carlo and molecular dynamics codes. Commercial applications like ABAQUS, COMSOL, Mathematica, and Matlab are also widely used.

Integration with LUMI/Puhuri

The third planned integration is with the LUMI supercomputer. LUMI has its own project management portal called Puhuri, which is used to create projects on the LUMI supercomputer. The consortium has worked with the Puhuri development team to support the functionality from the DeiC Integration Portal. Due to the scope of the Puhuri portal, this integration will, however, be limited to project management and requests of resources on LUMI. It is not yet possible to run jobs on LUMI directly from the DeiC Integration Portal.

What comes next?

With the DeiC Integration Portal now launched, more DeiC services can be added in the future. The majority of DeiC HPC services are already part of the portal: DeiC Interactive HPC (with hardware placed at both SDU and AAU), DTU Sophia (part of DeiC Throughput HPC), DeiC Large Memory HPC and LUMI. The remaining DeiC HPC services, part of DeiC Throughput HPC, will be added in the future.

The DeiC Integration Portal will also make it possible to integrate with the upcoming DeiC data management services. A possible integration with DeiC data management services could mean that researchers will be able to use their data across the whole portfolio of DeiC services, for example to analyse data at different DeiC HPC centers.

In collaboration with DeiC, we plan to improve the look and branding of the new DeiC Integration Portal.

Outside of Denmark, the functionality of the new DeiC Integration Portal has already caught the attention of research institutions. These include the HALRIC consortium, which recently received 11 million euros to build collaborations between companies, hospitals and universities (press release from Lund University). Within Denmark, there has been a dialogue with the Danish Bioimaging Infrastructure (DBI-INFRA) Image Analysis Core Facility, which is also interested in the possibilities offered by the platform.

No doubt, the attention the DeiC Integration Portal has received both nationally and on a European level is an acknowledgement of the skills and competences of the consortium’s developers and the original vision of the DeiC Board from 2020. Surely, this is only the beginning of many future collaborations, which will benefit the research environment in Denmark.

Teaching Workshop

CodeRefinery workshop March 21-23 and 28-30, 2023

Course goals

In this course, you will become familiar with tools and best practices for scientific software development. This course will not teach a programming language, but we teach the tools you need to do programming well and avoid common inefficiency traps. The tools we teach are practically a requirement for any scientist who needs to write code. The main focus is on using Git for efficiently writing and maintaining research software.
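As a small taste of that Git focus, a typical first workflow looks like the following (generic Git commands, not actual course material; the project and file names are invented for illustration):

```shell
# Create a repository, record a first script, and inspect the history.
git init myproject
git -C myproject config user.name "Student"
git -C myproject config user.email "student@example.org"
echo "print('hello')" > myproject/analysis.py
git -C myproject add analysis.py
git -C myproject commit -m "Add first analysis script"
git -C myproject log --oneline
```

Every change committed this way can later be inspected, reverted, or shared, which is exactly the reproducibility habit the course aims to build.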


If you identify with any of the points below, then this course is for you:

  • You write scripts to process data.
  • You change scripts written by your colleagues.
  • You write code that is used in research by you or others.
  • You wish you could re-run your own code after a few months.
  • You wish you could reproduce your own results better.
  • You wish you could automate your work better.
  • You, or your group, can’t share or reuse code.
  • You overall want to become more efficient at your work, by using the best possible tools.


The workshop will be held on March 21-23 and 28-30, 2023

Go to the CodeRefinery workshop webpage for more information and registration.

About CodeRefinery

CodeRefinery acts as a hub for FAIR (Findable, Accessible, Interoperable, and Reusable) software practices. It currently focuses on the Nordic/Baltic countries, but aims to expand beyond this region. CodeRefinery aims to operate as a community project with support from academic organisations.

CodeRefinery is a project within the Nordic e-Infrastructure Collaboration (NeIC). NeIC is a joint initiative of the Nordic countries, and the NeIC Board is appointed based on nominations by the national e-infrastructure provider organisations. These strategic partner organisations are CSC (Finland), SNIC (Sweden), Sigma2 (Norway), DeiC (Denmark), RH Net (Iceland) and ETAIS (Estonia).

DeiC HPC Supercomputing

H2-2023 Call for national HPC resources is now open

With this call, DeiC invites applications from scientific staff and students at the eight Danish universities for the allocation of compute resources on the national HPC facilities. The call is open to all scientific areas. Call H2-2023 is the second call for applications for compute resources on the national e-infrastructure. It is an open call with no specific purpose.

The call is for all HPC facilities. For information about facilities other than DeiC Interactive HPC, please visit DeiC. The resources available specifically for Interactive HPC are:

Center                           Unit         Resources (2 year)   Resources (2 years)
DeiC Interactive HPC – CPU       CPU core/h   4,993,000            3,995,000
DeiC Interactive HPC – GPU       GPU core/h   239,300              191,000
DeiC Interactive HPC – Storage   TB           400                  300
Interactive HPC resources available in the Call

Time Schedule for application and grant

H2-2023 is the second call for national compute resources. The time schedule is as follows:

Publication date: 5th of January 2023
Deadline for applications: 5th of March 2023
Assessment of compliance with formal requirements: 17th of March 2023
Technical and scientific assessment by the e-resource committee: 8th of May 2023
DeiC Board approves the grant recommendation from the e-resource committee: late May 2023
Applicants receive letter of grant or letter of rejection: 15th of June 2023
Front Office and HPC Centers are notified of the grant distribution: 15th of June 2023
Allocated resources available from: 1st of July 2023

For more information on this call, visit DeiC: Apply for HPC Resources

DeiC HPC Interactive HPC Supercomputing UCloud

Interactive HPC reaches 5000 users

On Saturday, December 3rd, DeiC Interactive HPC reached 5000 users.

This means that in the past three months, DeiC Interactive HPC has had 1000 new users.

We’re very proud of the platform’s success. Since the national HPC services began in November 2020, it has been evident that DeiC Interactive HPC, via the UCloud platform, has diversified the use of HPC across different fields of research by lowering the threshold for non-experts.

The fast growth of users also reflects that DeiC Interactive HPC has proven extremely useful for teaching and is now increasingly used by universities outside of the consortium of universities running the service.

This story was originally posted on and can be read in full via

HPC Interactive HPC Research Supercomputing UCloud

UCloud as a complementary HPC tool within theoretical particle physics

Though supercomputers form the key basis of his research, UCloud has been a valuable, complementary tool for Tobias and his colleagues and will most likely continue to be so in future work as well.

Post.doc. Tobias Tsang works within the broader research field of theoretical particle physics. As part of the Centre for Cosmology and Particle Physics Phenomenology (CP3-Origins) at the University of Southern Denmark, his research more specifically concerns quantum field theory and quantum chromodynamics (QCD), including how fundamental particles, protons and neutrons, interact with each other:

My research aims to provide high precision predictions based solely on the theory of the Standard Model – the best-known understanding of the interaction of fundamental (i.e. not containing ‘smaller constituents’) particles. This is done via very large-scale numerical simulations using the most powerful supercomputers around the world.

Post.doc. Tobias Tsang, Centre for Cosmology and Particle Physics Phenomenology (CP3-Origins) at University of Southern Denmark

Experience and achievements

More traditional mathematical methods that can be written down with pen and paper do not apply to research on quantum chromodynamics. As such, Tobias’ research relies on a method called ‘Monte Carlo’, which is applied to compute statistical field theories of simple particle systems. Though this type of research is done using very large supercomputers, Tobias has recurrently used UCloud for exploratory studies on smaller volumes of data:

When doing large scale simulations, we sometimes do it on something called ‘10,000 cores in parallel’, and clearly this is not something we can easily do on a resource like UCloud. But for the small exploratory studies, UCloud is a nice resource in the sense that it is available; you don’t have to sit here on a hot day and burn your laptop to death – you can send it to UCloud and run it there. I think this is kind of the point where I have used UCloud the most; for small exploratory studies and some of the projects that don’t need a huge amount of computer time but still a significant portion.

Post.doc. Tobias Tsang

Though UCloud has served as a supplemental rather than a key tool in Tobias’ work together with the CP3-Origins research centre, he describes it as a nice complement to other HPC resources:

“I don’t think UCloud will ever be the only resource we use. But this is also the design of it; UCloud is not meant to be a huge machine, it is meant to be an available resource that is easy to use and that gives you a playground to set up things really from scratch where you can test things out and run smaller jobs and analyses. In that sense, it is quite complementary to a lot of the things we normally work with. For exploratory studies and for code testing, UCloud will definitely remain very useful.”

Post.doc. Tobias Tsang

In one specific project done at SDU a few years back, as a collaboration between CP3 and IMADA (the Department of Mathematics and Computer Science), the vast majority of samples were generated on UCloud, and a significant amount of data production and measurements were also carried out there [1]. UCloud needs, however, to be considered a part of a whole, according to Tobias:

“It is not that one particular machine made it possible; we would otherwise have found another machine to run it on. But UCloud provided us with a nice set up where we could just use local resources without having to go through big grant applications to get computer time.”

Post.doc. Tobias Tsang

Pros and cons

In terms of time optimization, UCloud has also been a game changer for Tobias:

One of the nice things about UCloud compared to other machines is the wall clock time: quite often, for larger clusters, depending on the cluster though, you are very much restricted by the queue policies. So, there are some clusters where you have a maximum run time of 4 hours, and if you happen to run a small job that is longer than this, then you can’t – you have to always tailor your job to fit exactly and to make the maximum use of it. On UCloud you have a 200-hour wall clock. This is very helpful as for a lot of these things that have to run sequentially, you might not need a huge resource, you just need to have a long enough time span to actually do it.

Post.doc. Tobias Tsang

Though UCloud slowed the work process down a bit in the beginning as everything had to be installed and set up, this downside was quickly resolved and overshadowed by the benefits: 

“Once you get used to it, you can kind of equalize the work process to what you would have on a cluster where everything is just readily installed.”

Post.doc. Tobias Tsang

Despite pros and cons, Tobias describes UCloud as a flexible system:

The fact that UCloud is really just a virtual machine has both positive and negative sides. The positive side is that you are really free to do whatever you want to do; you can install everything and you don’t have any restrictions that you would have on larger clusters where you can’t easily install software, or you can’t install it into the parts where you want to install it. On larger clusters, you are typically limited by the compilers that are already there. So, from that point of view, UCloud, at least to me, seems like a more flexible system. The downside is that you have to install everything; you can’t just quickly run something, you kind of have to constantly install everything from scratch.

Post.doc. Tobias Tsang

Last but not least, Tobias stresses the interaction with the UCloud front office as a major benefit that has helped the research group significantly, especially compared to other clusters with a much longer response time:

One of the nice things with UCloud as a general system is that every time something didn’t work, we got a really quick email back. Any questions we raised were answered quickly, so it was never something that kept us stuck for weeks or months – typically things were resolved in a very timely time scale. And things that we actively suggested as nice features or things that we thought were missing on UCloud were likewise addressed.

Post.doc. Tobias Tsang

[1] Della Morte, Jaeger, Sannino, Tsang and Ziegler, “One Flavour QCD as an analogue computer for SUSY”, PoS LATTICE2021 (2022) 225.

Interactive HPC Research UCloud

National Health Data Science Sandbox for Training and Research

UCloud is not just an ideal platform for the individual researcher who wants interactive access to HPC resources or an easy way to collaborate with national or international partners. It is also highly suitable for teaching. Jennifer Bartell and Samuele Soraggi, who are both working on the project National Health Data Science Sandbox for Training and Research, share their experiences with using UCloud.

National “sandbox” platform

The growing amounts of data in all research fields offer researchers new opportunities and possibilities for scientific breakthrough. In the case of health science, the use of large amounts of data has great potential to improve our health care – it can e.g. expand our ability to understand and diagnose diseases. One of the constraints of using health data is that many datasets (e.g. person-specific health records or genomics data) are sensitive from a patient privacy perspective and governed by strict access and usage guidelines. This can be a major challenge in particular for students or researchers who are just learning best practices in handling health data while also developing data science skills.

Go to SDU eScience for full story

DeiC HPC Interactive HPC Supercomputing UCloud UCloud status Uncategorized

DeiC Interactive HPC reaches 4,000 users on UCloud

We’re approaching the end of the second year with DeiC Interactive HPC – and there are now 4000 users on UCloud!

During the first year with DeiC Interactive HPC, UCloud reached more than 2000 users. We’re glad that the interest in the platform has continued to grow throughout the second year.

Go to SDU eScience for full story

HPC Interactive HPC Teaching UCloud

Teaching Humanities in UCloud

UCloud has been a game changer for Assistant Professor of Cognitive Science and Humanities Computing (Aarhus University), Ross Deans Kristensen-McLachlan, teaching within the crossroads of cultural studies and data science.

In short, the benefits of UCloud within teaching narrow down to a much smoother teaching process, free of unnecessary technical issues, allowing teachers as well as students to focus on the substance of their work.

Benefits when teaching in UCloud

One of the major benefits is the computational resources available in terms of having more computing power, allowing students to focus on state-of-the-art work.

Assistant Professor Ross Deans Kristensen-McLachlan

Ross has been teaching two elective Cultural Data Science bachelor courses as well as a master’s level course on Natural Language Processing (NLP) for students of Cognitive Science. A clear before and after characterises the two elective courses, which were formerly run on a local server: since more than 25 students typically had to have access to the server, keeping it running required a lot of energy and time. As a result, the time left for state-of-the-art work was limited, but with UCloud this kind of downtime has been reduced significantly. Barriers that could potentially make students new to computational methods lose interest in the field have therefore also been reduced.

Another major benefit, according to Ross, is that UCloud allows all students to work from the same starting point and reduces possible imbalances between students with brand new computers and students with older computer models:

One thing about UCloud, that I actually think is quite important, is that it kind of democratises access to resources.

Assistant Professor Ross Deans Kristensen-McLachlan

In terms of teaching, several palpable benefits allow teachers and students alike to concentrate on the substantial content of the respective courses. Some challenges do, however, arise in class, though these are typically rather insignificant such as some minor issues when integrating with GitHub.

UCloud and the humanities

When teaching the elective courses on Cultural Data Science, Ross has encountered humanities students with no background within computational methods whatsoever. This, however, turned out to be an advantage as the students were typically open and able to adapt quickly:

Because UCloud has eliminated a lot of former technical obstacles and barriers, students can focus on learning good programming practices and the results of their research. It allows us to focus on the task at hand. The students don’t have to know how the backend works; they don’t have to be computer scientists – they are humanities students and should be able to think about humanities objects (texts, visuals etc.) using computational methods.

Assistant Professor Ross Deans Kristensen-McLachlan

As such, UCloud is “a means to an end”, Ross emphasises. Though computational background knowledge is of course far from irrelevant, the objective for the Cultural Data Science courses has been to educate the students to think critically when working with computational methods:

We are not just looking at data science methods and applying them uncritically. We try to use the students’ main expertise and encourage them to apply their subject knowledge to think critically about their results when working with computational methods. Determining the notion of ‘genre’ from a classification model, for example, urges the students to think critically about the notion itself – is it even something we can determine from text alone?

Assistant Professor Ross Deans Kristensen-McLachlan

Overall, the students following Ross’ courses have been extremely positive about UCloud, even though some were sceptical to begin with. Two kinds of feedback characterised the students’ reception of UCloud in general: one group fully embraced UCloud from the start, while others came to accept it as a necessary (and useful) tool.

Collaborative teaching resources

Among teachers from the department of Linguistics, Cognitive Science, and Semiotics at Aarhus University, UCloud has furthermore improved internal coherence across the department, for the benefit of both students and teachers. As most teachers have moved all their material onto UCloud, students now avoid using one set of tools for one course and a different set for another.

Besides the many teaching-related benefits to be gained from UCloud, Ross further emphasises the ongoing dialogue between users of UCloud and the team who maintain it:

They are very responsive to suggestions. Over the past year it’s (UCloud) become even more fully featured in terms of what you can do with it, and I don’t see that stopping any time soon.

Assistant Professor Ross Deans Kristensen-McLachlan

One potential improvement of UCloud, Ross suggests, could be the implementation of some sort of outreach program in order to get even more people to gain from the benefits:

UCloud gets rid of all the annoying things. As far as I can see there are only benefits – the minor issues are vastly outnumbered by the benefits.

Assistant Professor Ross Deans Kristensen-McLachlan
HPC Interactive HPC Supercomputing UCloud

New UCloud release

Since we discontinued our support for mounting local folders onto UCloud using WebDAV, we have been searching for a way to let UCloud users work with their files locally without having to re-upload them to UCloud after every change. We are happy to announce a new solution that makes it possible to synchronize your local files with your UCloud file storage.

Go to SDU eScience for more information on the release

DeiC HPC Supercomputing UCloud

Call for applications for National HPC resources

As a researcher or PhD student at a Danish university, you can now apply for resources on the national HPC centers, including the Danish part of EuroHPC LUMI.

The second call for applications for regular access to resources on the national HPC centers is now open. This includes applications for the Danish part of EuroHPC LUMI. The call is open for applications from all research areas.

As part of the use of the national e-infrastructures, DeiC issues calls for applications for the use of the national resources. Projects are granted resources after application, on the basis of an assessment of research quality and technical feasibility.

The applications are evaluated by the appointed e-resource committee, and the grants are approved by the DeiC Board.

The deadline for applications is the 4th of October 2022 at midnight, and the resources will be available for use from the 1st of January 2023.

For more information visit DeiC