Categories
DeiC HPC Interactive HPC Supercomputing UCloud UCloud status

New milestone as DeiC Interactive HPC reaches 6,000 users

Recently, the DeiC Interactive HPC consortium (which consists of Aarhus University, Aalborg University and the University of Southern Denmark) posted a news story about user overload on the service. The issue arises from the very positive fact that the popularity of the DeiC Interactive HPC service is increasing, but it also has the unfortunate effect that some users are now experiencing longer waiting times on the machines than usual.

Needless to say, however, the DeiC Interactive HPC consortium is immensely proud of the success of the service – and now it can also announce that the UCloud platform, which is used to provide the DeiC Interactive HPC service, has passed 6,000 users. So many users on a service which has only been operational for 2.5 years is a great achievement.

“Overall, we’re seeing an increase in the number of new users at nearly all eight universities in Denmark. One significant factor over the past few months is that the number of users from the University of Copenhagen, which has not used the facility extensively so far, is now increasing at a steady pace. This is great news, but also a warning sign for us, as KU is a big university and we will need to be prepared to accommodate even more users in the future.”

Prof. Claudio Pica, director of the SDU eScience Center and DeiC Interactive HPC Consortium representative

Another factor, which causes significant spikes in the number of new users at the start of every semester, is the number of students who log on to the service because the UCloud software is used as part of their courses.

The consortium is working hard to add more hardware to the DeiC Interactive HPC service to alleviate the current periods of overload. In the meantime, you can find a number of tips on how to avoid overload issues here.

This update was originally posted by the SDU eScience Center.


The cost of success – user overload on DeiC Interactive HPC

 

While there was never any doubt that DeiC Interactive HPC would be a success, the popularity of the HPC facility has taken the Interactive HPC consortium a little by surprise. The two-year-old system reached no fewer than 5,000 users back in December, and while every milestone is celebrated, the rapid success also comes at a price.

We’re seeing an average rise of 1,000 users each quarter and we’re very proud of the success. However, with an average utilisation of 135% of resources for containerised applications, we’re also experiencing issues with user overload recurring more and more frequently as more users join.

DeiC Interactive HPC Consortium representative, Professor Kristoffer Nielbo, Center for Humanities Computing, Aarhus University

Additional hardware would solve most of the issues but adding more hardware is a time-consuming process, and in the meantime the consortium behind Interactive HPC is working on other solutions to ensure the best user experience. 

We are currently working on making operational status available to users to allow them to see when user overload is causing issues and plan their work accordingly. The part of Interactive HPC running at SDU already has a solution underway, and the part of the system located at AAU will follow as soon as possible.

Professor Kristoffer Nielbo

However, users can also actively do things to avoid the user overload issues. The DeiC Interactive HPC consortium recommends that small users make sure they only use the resources they need; that medium users consider whether their work could be done on other HPC systems; and that large users apply for resources via the national calls.

Applying for national resources may not fix the problem right now, but by doing so researchers indicate that there is a need for additional hardware for Interactive HPC, and this can help speed up the expansion process.

Professor Kristoffer Nielbo

The consortium also recommends using the new DeiC Integration Portal when it makes sense. The portal integrates multiple national HPC systems and allows users to shift seamlessly to other facilities, freeing up space for users whose only option is Interactive HPC.

The consortium will continue working to solve any issues and to ensure that the necessary resources are available, because there is no doubt that Interactive HPC is here to stay as a favourite HPC resource for researchers.


Launch of the DeiC Integration Portal

Since the DeiC HPC services started in November 2020, a consortium of universities (AU, DTU and SDU) has been working hard to finish the ambitious development of the DeiC Integration Portal. The vision of the DeiC Integration Portal is to provide a national solution for accessing all the DeiC HPC systems and future DeiC services under one common portal. After two years of development, UCloud has now been expanded with new functionality to integrate with the DeiC HPC providers.

Denmark currently has three national HPC services operated and hosted by different consortia of Danish universities and coordinated by the Danish e-Infrastructure Cooperation (DeiC). All researchers in Denmark can apply for resources on the national HPC services, including the Danish part of the European supercomputer LUMI, either through their universities’ Front Office or via national calls. 

Along with the establishment of the national HPC services, it was also envisioned that researchers should be able to access the DeiC systems via a common national portal. This portal should ideally make it “as easy to use the national HPC centers as AWS, Azure and Google cloud services” (from the DeiC call in 2020). The DeiC Board decided to make a call for expressions of interest for the development of the DeiC Integration Portal, which at the time was also referred to as Project 5.

In 2020, the consortium of universities consisting of AU, DTU and SDU, with SDU as the coordinating body, submitted a proposal to base this portal on UCloud. The proposal was accepted by the DeiC Board the same year.

When we answered the DeiC call in 2020, we understood the potential behind the vision of the DeiC Board. At the time the UCloud software platform was maturing into a full-fledged solution for e-research, and it seemed an ideal starting point for the DeiC Project 5 (DeiC Integration Portal).

Claudio Pica, professor at SDU and coordinator of the winning consortium.

Advantages for the users

For the users, there are many advantages to having a common portal for accessing the national HPC services. Professor Kristoffer Nielbo from the Center for Humanities Computing at Aarhus University explains:

“As a researcher (and an infrastructure provider), a common portal brings us closer to the seamless integration of multiple national HPC systems. Such access simplifies my workflows and saves valuable resources otherwise spent on mentally, and sometimes physically, ‘switching’ between platforms. It also makes transitioning from interactive to batch jobs less ‘scary.’ Finally, the portal reduces resources spent on onboarding new researchers in my lab because they only have to learn how to access HPC through the Integration Portal.”

Professor Kristoffer Nielbo, Center for Humanities Computing, Aarhus University

Kristoffer Nielbo tested the portal during the project’s pilot phase in Fall 2022, and he was very happy with the result.

I was surprised at how well the portal reproduced the familiar user experience of DeiC Interactive HPC – where UCloud has been used for several years. Even though the mode of running jobs is fundamentally different (although DeiC Interactive HPC can run batch jobs), the project and file management, which are large parts of UCloud, were very similar. I wish more national HPC systems had been available during testing.

Professor Kristoffer Nielbo


A common portal also makes it easier for the DeiC Interactive HPC users to use and transition to other more “traditional” HPC systems, such as the LUMI supercomputer.


Even in my lab, I can see that more researchers that used to use DeiC Interactive HPC are now planning to use DeiC Throughput HPC. Project 5 arrived at the right time for many DeiC Interactive HPC users – we have just started to ‘develop an appetite’ for HPC. That being said, I see the different national HPC systems as complementary, and Project 5 enables more users to benefit from more systems.

Professor Kristoffer Nielbo

Implementation of the design

To better understand how the DeiC Integration Portal has been implemented in UCloud, it may be useful to look at how UCloud used to work. In the figure below, an end-user wants to run an application. Using their laptop, they open UCloud, find the application in the application store and click on the “Start” button. This causes their laptop to send a message to UCloud, containing the user’s command. UCloud then sends a similar message to the “DeiC Interactive HPC” computing resources (in this example the YouGene cluster at SDU).

How UCloud used to work before implementation of the DeiC Integration Portal

In Project 5, the consortium developed a component called the UCloud Integration Module (or UCloud/IM), which sits at, and is controlled by, the service provider. The UCloud/IM communicates with UCloud and exposes the computing resources of the provider. The service providers have full control over what the UCloud/IM can do.

How the system works after implementation of the DeiC Integration Portal and the Integration Module component.

At a technical level, UCloud/IM is plugin-based software. This means that, as a provider, you can choose and adapt the IM to fit your environment. We have packed it full of features for controlling authentication and authorization. It has several different implementations for compute, storage, licenses and more.

Dan Sebastian Thrane, team leader for cloud services at the SDU eScience Center

The UCloud/IM was designed to maintain a high level of IT security and the integrity of the individual service providers.

To use an analogy, without the UCloud/IM, sending a message via the DeiC Integration Portal would (from the service providers’ perspective) be like giving the postman the keys to your house to deliver the mail. Instead the UCloud/IM acts like a “mailbox”, where the postman can leave your mail without entering your house.

Design Principles

It has been important for the consortium behind the DeiC Integration Portal to have a transparent design and an inclusive development process. A DeiC Steering Group, which included representatives from all the universities in Denmark, was formed by the DeiC Board. This steering group has discussed the design of the portal throughout the development period and approved the final result.

It has also been important for the consortium and DeiC to stress that the DeiC Integration Portal does not replace or control any functionality which DeiC service providers have. It simply exposes these functionalities in a secure and user-friendly way to all users with a common interface, acting as a secure message brokering system.

The DeiC Integration Portal initiative aims to facilitate access to remote compute resources through a joint portal with multiple backend HPC resources. These backend service providers are at the same time HPC service providers to their home universities and part of the emerging national HPC infrastructure. This mission duality implies that the resource providers, at all times, should be able to maintain full integrity and local control.

Michael Rasmussen, section leader for Research-IT (RIT) at Technical University of Denmark.

Full integrity and local control has been achieved by following a set of design principles:

  1. Zero trust design
  2. Exclusively users local to the service providers
  3. Configurable integration module with no elevated privileges 
  4. Local validation and authorization control for all actions following the local policies 

‘Never trust, always verify’ (zero trust) has been a guiding principle for the design of the process from initiating a job, submitting the job request to the service provider, queuing and executing the job, and finally reporting back to the portal. Users authenticate with home-institution credentials (via WAYF) on login to the Integration Portal and can from here apply for compute resources. Once the DeiC Front Office of a user’s home-institution approves an application for resources, the local resource provider can authorize access by having a local user account created and associated with the user’s DeiC Integration Portal account.

Michael Rasmussen

If a user does not comply with the code of conduct, the compute resource provider can disable the user’s connection via the integration module and lock the local user account to prevent re-logins until further notice. This means that only user accounts validated, created and authenticated locally act on the local resource provider facility, thereby ensuring local integrity and control.

If a DeiC Integration Portal user unknown to the local resource provider facility submits a job, the process of validating and creating the new user account is completely controlled by the resource provider. This ensures that only locally validated users act on the local facility.

Michael Rasmussen
DeiC Integration Portal provides access to the four HPC types that make up the national HPC landscape

Integration with DeiC Large Memory HPC

On the 19th of December 2022, the DeiC Large Memory HPC system became the first of several planned service providers to be enabled on the DeiC Integration Portal.

DeiC Large Memory HPC is a traditional HPC system with large memory nodes (up to 4TB per node) based on Slurm as the workload manager. This kind of system is historically used primarily by the natural sciences, such as physics and chemistry, for large scale simulations of physical and biological systems via non-interactive batch jobs. As such, this kind of system is very different from the DeiC Interactive HPC platform.

Traditional HPC users from the natural sciences will also benefit from the new integration.

The DeiC Integration Portal provides project management features previously lacking on the system. From the platform, the project PI (or a project administrator) can manage users in the project themselves. Previously they had to write to the user support whenever a user had to be added to the project. Similarly, users are now able to upload their SSH keys directly, instead of sending them via mail.

Martin Lundquist Hansen, team leader for the infrastructure team at the SDU eScience Center

Martin Lundquist Hansen furthermore explains that:

The integration also allows users to manage their files and Slurm jobs directly from the UCloud platform. This is especially important for users less familiar with traditional text based HPC systems, but even for more experienced users this might be convenient in some cases. It is important to emphasize, however, that the DeiC Integration Portal simply provides an additional method for accessing the system, while traditional SSH access is still possible and unchanged.

Martin Lundquist Hansen

Like Kristoffer Nielbo, Martin Lundquist Hansen stresses that the DeiC Integration Portal may help users of the DeiC Interactive HPC system transition to other DeiC HPC systems:

With the new integration, users can consume resources on the DeiC Large Memory HPC system in the same way they are already consuming resources on UCloud. There is of course a difference in the type of applications that conventionally are used on the two types of systems, but they can now be accessed and executed in a uniform way. As users learn to run jobs on the system via the UCloud platform, the transition to accessing the system via SSH might also become easier, due to familiarity with certain aspects of the system.

Martin Lundquist Hansen

The implementation of the DeiC Integration Portal also offers a new avenue for running interactive jobs on traditional HPC clusters, like the DeiC Large Memory HPC system, something that is not typically done on these types of systems. For example, a popular application is JupyterLab, a web-based application that allows you to work interactively with languages such as Python and R. Thanks to the DeiC Integration Portal, these applications can be launched as a Slurm job, and users can then work with the application directly from their browsers.
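The pattern described above – wrapping an interactive web application in a batch job – can be sketched as a Slurm job script. This is only an illustrative sketch, not the portal's actual configuration: the resource values, the module name and the port are assumptions, and on a real cluster the portal (or an SSH tunnel) would forward the chosen port to the user's browser.

```shell
#!/bin/bash
#SBATCH --job-name=jupyterlab
#SBATCH --nodes=1
#SBATCH --cpus-per-task=4
#SBATCH --mem=32G
#SBATCH --time=08:00:00

# Load a Python environment; the module name is site-specific.
module load python

# Start JupyterLab on the compute node, listening on all interfaces
# so that the connection can be forwarded to the user's browser.
jupyter lab --no-browser --ip=0.0.0.0 --port=8888
```

Submitted with `sbatch`, the interactive session then runs under the same scheduling and accounting as any other batch job.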

We are planning to implement more applications of this type in the future, such that the resources are more readily available for non-expert users.

Martin Lundquist Hansen

Currently JupyterLab and RStudio are available for the DeiC Large Memory HPC.

Integration with DTU Sophia

The DTU Sophia HPC cluster, which is part of the DeiC Throughput HPC service, is also available on the DeiC Integration Portal.

The Sophia system is hosted at DTU Campus Risø. The HPC cluster consists of dual-processor AMD EPYC nodes fully connected through a 100G InfiniBand fat-tree topology. The full description can be found in the system documentation.

Currently, the main user groups on Sophia are from DTU Wind and DTU Construct. They typically run heavy-duty numerical simulations such as Computational Fluid Dynamics workloads, using software such as Ellipsys, OpenFOAM, PETSc, and WRF. Other commonly used applications are AI/machine learning, quantum chemistry (Density Functional Theory), Monte Carlo and molecular dynamics codes. Commercial applications, like ABAQUS, COMSOL, Mathematica, and Matlab, are also widely used.

Integration with LUMI/Puhuri

The third planned integration is with the LUMI supercomputer. LUMI has its own project management portal called Puhuri, which is used to create projects on the LUMI supercomputer. The consortium has worked with the Puhuri development team to support the functionality from the DeiC Integration Portal. Due to the scope of the Puhuri portal, this integration will, however, be limited to project management and requests of resources on LUMI. It is not yet possible to run jobs on LUMI directly from the DeiC Integration Portal.

What comes next?

With the DeiC Integration Portal now launched, more DeiC services can be added in the future. The majority of DeiC HPC services are already part of the portal: DeiC Interactive HPC (with hardware located at both SDU and AAU), DTU Sophia (part of DeiC Throughput HPC), DeiC Large Memory HPC and LUMI. The remaining DeiC HPC services, which are part of DeiC Throughput HPC, will be added later.

The DeiC Integration Portal will also make it possible to integrate with the upcoming DeiC data management services. A possible integration with DeiC data management services could mean that researchers will be able to use their data across the whole portfolio of DeiC services, for example to analyse data at different DeiC HPC centers.

In collaboration with DeiC, we plan to improve the look and branding of the new DeiC Integration Portal.

Outside of Denmark, the functionality of the new DeiC Integration Portal has already caught the attention of research institutions. These include the HALRIC consortium, which recently received 11 million euros to build collaborations between companies, hospitals and universities (press release from Lund University). Within Denmark, there has been a dialogue with the Danish Bioimaging Infrastructure (DBI-INFRA) Image Analysis Core Facility, which is also interested in the possibilities offered by the platform.

No doubt, the attention the DeiC Integration Portal has received both nationally and on a European level is an acknowledgement of the skills and competences of the consortium’s developers and the original vision of the DeiC Board from 2020. Surely, this is only the beginning of many future collaborations, which will benefit the research environment in Denmark.


Interactive HPC reaches 5,000 users

On Saturday, December 3rd, DeiC Interactive HPC reached 5,000 users.

This means that in the past three months, DeiC Interactive HPC has had 1,000 new users.

We’re very proud of the platform’s success. Since the National HPC services began in November 2020, it has been evident that DeiC Interactive HPC, via the platform UCloud, has diversified the use of HPC across different fields of research by lowering the threshold for non-experts.

The fast growth of users also reflects that DeiC Interactive HPC has proven extremely useful for teaching and is now increasingly used by universities outside of the consortium of universities running the service.

This story was originally posted on escience.sdu.dk, where it can be read in full.


UCloud as a complementary HPC tool within theoretical particle physics

Though supercomputers form the key basis of his research, UCloud has been a valuable, complementary tool for Tobias and his colleagues and will most likely continue to be so in future work as well.

Post.doc. Tobias Tsang works within the broader research field of theoretical particle physics. As part of the Centre for Cosmology and Particle Physics Phenomenology (CP3-Origins) at the University of Southern Denmark, his research more specifically concerns quantum field theory and quantum chromodynamics (QCD), including how fundamental particles, protons and neutrons, interact with each other:

My research aims to provide high precision predictions based solely on the theory of the Standard Model – the best-known understanding of the interaction of fundamental (i.e. not containing ‘smaller constituents’) particles. This is done via very large-scale numerical simulations using the most powerful supercomputers around the world.

Post.doc. Tobias Tsang, Centre for Cosmology and Particle Physics Phenomenology (CP3-Origins) at University of Southern Denmark

Experience and achievements

More traditional mathematical methods that can be written down with pen and paper do not apply to research on quantum chromodynamics. As such, Tobias’ research relies on a method called ‘Monte Carlo’, which is applied to compute statistical field theories of simple particle systems. Though this type of research is done using very large supercomputers, Tobias has repeatedly used UCloud for exploratory studies of smaller volumes of data:

When doing large scale simulations, we sometimes do it on something called ‘10,000 cores in parallel’, and clearly this is not something we can easily do on a resource like UCloud. But for the small exploratory studies, UCloud is a nice resource in the sense that it is available; you don’t have to sit here on a hot day and burn your laptop to death – you can send it to UCloud and run it there. I think this is kind of the point where I have used UCloud the most; for small exploratory studies and some of the projects that don’t need a huge amount of computer time but still a significant portion.

Post.doc. Tobias Tsang

Though UCloud has served as a supplemental rather than a key tool in Tobias’ work together with the CP3-Origins research centre, he describes it as a nice complement to other HPC resources:

“I don’t think UCloud will ever be the only resource we use. But this is also the design of it; UCloud is not meant to be a huge machine, it is meant to be an available resource that is easy to use and that gives you a playground to set up things really from scratch where you can test things out and run smaller jobs and analyses. In that sense, it is quite complementary to a lot of the things we normally work with. For exploratory studies and for code testing, UCloud will definitely remain very useful.”

Post.doc. Tobias Tsang

In one specific project, a collaboration between CP3 and IMADA (the Department of Mathematics and Computer Science) carried out at SDU a few years back, the vast majority of samples were generated on UCloud, and a significant amount of data production and measurements were also carried out there [1]. UCloud needs, however, to be considered a part of a whole, according to Tobias:

“It is not that one particular machine made it possible; we would otherwise have found another machine to run it on. But UCloud provided us with a nice set up where we could just use local resources without having to go through big grant applications to get computer time.”

Post.doc. Tobias Tsang

Pros and cons

In terms of time optimization, UCloud has also been a game changer for Tobias:

One of the nice things about UCloud compared to other machines is the wall clock time: quite often, for larger clusters, depending on the cluster though, you are very much restricted by the queue policies. So, there are some clusters where you have a maximum run time of 4 hours, and if you happen to run a small job that is longer than this, then you can’t – you have to always tailor your job to fit exactly and to make the maximum use of it. On UCloud you have a 200-hour wall clock. This is very helpful as for a lot of these things that have to run sequentially, you might not need a huge resource, you just need to have a long enough time span to actually do it.

Post.doc. Tobias Tsang

Though UCloud slowed the work process down a bit in the beginning as everything had to be installed and set up, this downside was quickly resolved and overshadowed by the benefits: 

“Once you get used to it, you can kind of equalize the work process to what you would have on a cluster where everything is just readily installed.”

Post.doc. Tobias Tsang

Despite pros and cons, Tobias describes UCloud as a flexible system:

The fact that UCloud is really just a virtual machine has both positive and negative sides. The positive side is that you are really free to do whatever you want to do; you can install everything and you don’t have any restrictions that you would have on larger clusters where you can’t easily install software, or you can’t install it into the parts where you want to install it. On larger clusters, you are typically limited by the compilers that are already there. So, from that point of view, UCloud, at least to me, seems like a more flexible system. The downside is that you have to install everything; you can’t just quickly run something, you kind of have to constantly install everything from scratch.

Post.doc. Tobias Tsang

Last but not least, Tobias stresses the interaction with the UCloud front office as a major benefit that has helped the research group significantly, especially compared to other clusters with a much longer response time:

One of the nice things with UCloud as a general system is that every time something didn’t work, we got a really quick email back. Any questions we raised were answered quickly, so it was never something that kept us stuck for weeks or months – typically things were resolved in a very timely manner. And things that we actively suggested as nice features or things that we thought were missing on UCloud were likewise addressed.

Post.doc. Tobias Tsang

[1]  Della Morte, Jaeger, Sannino, Tsang and Ziegler, “One Flavour QCD as an analogue computer for SUSY”, PoS LATTICE2021 (2022) 225, https://doi.org/10.22323/1.396.0225


National Health Data Science Sandbox for Training and Research

UCloud is not just an ideal platform for the individual researcher who wants interactive access to HPC resources or an easy way to collaborate with national or international partners. It is also highly suitable for teaching. Jennifer Bartell and Samuele Soraggi, who are both working on the project National Health Data Science Sandbox for Training and Research, share their experiences with using UCloud.

National “sandbox” platform

The growing amounts of data in all research fields offer researchers new opportunities and possibilities for scientific breakthrough. In the case of health science, the use of large amounts of data has great potential to improve our health care – it can e.g. expand our ability to understand and diagnose diseases. One of the constraints of using health data is that many datasets (e.g. person-specific health records or genomics data) are sensitive from a patient privacy perspective and governed by strict access and usage guidelines. This can be a major challenge in particular for students or researchers who are just learning best practices in handling health data while also developing data science skills.

Go to SDU eScience for full story


DeiC Interactive HPC reaches 4,000 users on UCloud

We’re approaching the end of the second year with DeiC Interactive HPC – and there are now 4,000 users on UCloud!

During the first year with DeiC Interactive HPC, UCloud reached more than 2,000 users. We’re glad that the interest in the platform has continued to grow throughout the second year.

Go to SDU eScience for full story


Teaching Humanities in UCloud

UCloud has been a game changer for Ross Deans Kristensen-McLachlan, Assistant Professor of Cognitive Science and Humanities Computing at Aarhus University, who teaches at the crossroads of cultural studies and data science.

In short, the benefits of UCloud for teaching come down to a much smoother teaching process, free of unnecessary technical issues, allowing teachers as well as students to focus on the substance of their work.

Benefits when teaching in UCloud

One of the major benefits is the computational resources available in terms of having more computing power, allowing students to focus on state-of-the-art work.

Assistant Professor Ross Deans Kristensen-McLachlan

Ross has been teaching two elective Cultural Data Science bachelor courses as well as a master’s level course on Natural Language Processing (NLP) for students of Cognitive Science. A clear before and after characterises the two elective courses, formerly run on a local server: since more than 25 students typically had to have access to the server, it naturally required a lot of energy and time. As a result, actual time to do state-of-the-art work was typically limited, but with UCloud this kind of downtime has been reduced significantly. Barriers that could potentially make students new to computational methods lose interest in the field have therefore also been reduced.

Another major benefit, according to Ross, is that UCloud allows all students to work from the same starting point and reduces possible imbalances between students with brand new computers and students with older computer models:

One thing about UCloud, that I actually think is quite important, is that it kind of democratises access to resources.

Assistant Professor Ross Deans Kristensen-McLachlan

In terms of teaching, several palpable benefits allow teachers and students alike to concentrate on the substantial content of the respective courses. Some challenges do, however, arise in class, though these are typically rather insignificant, such as minor issues when integrating with GitHub.

UCloud and the humanities

When teaching the elective courses on Cultural Data Science, Ross has encountered humanities students with no background within computational methods whatsoever. This, however, turned out to be an advantage as the students were typically open and able to adapt quickly:

Because UCloud has eliminated a lot of former technical obstacles and barriers, students can focus on learning good programming practices and the results of their research. It allows us to focus on the task at hand. The students don’t have to know how the backend works; they don’t have to be computer scientists – they are humanities students and should be able to think about humanities objects (texts, visuals etc.) using computational methods.

Assistant Professor Ross Deans Kristensen-McLachlan

As such, UCloud is “a means to an end”, Ross emphasises. Though computational background knowledge is of course far from irrelevant, the objective for the Cultural Data Science courses has been to educate the students to think critically when working with computational methods:

We are not just looking at data science methods and applying them uncritically. We try to use the students’ main expertise and encourage them to apply their subject knowledge to think critically about their results when working with computational methods. Determining the notion of ‘genre’ from a classification model, for example, urges the students to think critically about the notion itself – is it even something we can determine from text alone?

Assistant Professor Ross Deans Kristensen-McLachlan

Overall, the students following Ross’ courses have been extremely positive about UCloud, even though some were sceptical to begin with. Their feedback generally fell into two camps: one group embraced UCloud from the start, while the other came to accept it as a necessary (and useful) tool.

Collaborative teaching resources

Among teachers from the Department of Linguistics, Cognitive Science, and Semiotics at Aarhus University, UCloud has also improved internal coherence across the department, to the benefit of both students and teachers. As most teachers have moved their material onto UCloud, students no longer have to use one set of tools for one course and a different set for another.

Besides the many teaching-related benefits to be gained from UCloud, Ross further emphasises the ongoing dialogue between users of UCloud and the team who maintain it:

They are very responsive to suggestions. Over the past year it [UCloud] has become even more fully featured in terms of what you can do with it, and I don’t see that stopping any time soon.

Assistant Professor Ross Deans Kristensen-McLachlan

One potential improvement, Ross suggests, would be some form of outreach programme, so that even more people could gain from the benefits:

UCloud gets rid of all the annoying things. As far as I can see there are only benefits – the minor issues are vastly outnumbered by the benefits.

Assistant Professor Ross Deans Kristensen-McLachlan

New UCloud release

Since we discontinued support for mounting your local folders onto UCloud using WebDAV, we have been searching for a way to let UCloud users work with their files locally without having to re-upload them to UCloud after every change. We are happy to announce that we now have a new solution that makes it possible to synchronize your local files with your UCloud file storage.

Go to SDU eScience for more information on the release.


Call for applications for National HPC resources

As a researcher or PhD student at a Danish university, you can now apply for resources at the national HPC centers, including the Danish part of EuroHPC LUMI.

The second call for applications for regular access to resources on the national HPC centers is now open. This includes applications for the Danish part of EuroHPC LUMI. The call is open for applications from all research areas.

As part of the use of the national e-infrastructures, DeiC issues calls for applications to use the national resources. Projects are granted resources upon application and on the basis of an assessment of research quality and technical feasibility.

The applications are evaluated by the appointed e-resource committee, and the grants are approved by the DeiC board.

The deadline for applications is 4 October 2022 at midnight, and the resources will be available for use from 1 January 2023.

For more information, visit DeiC.