Categories
Interactive HPC Research UCloud Use case

DeiC Interactive HPC Crucial for Danish AI Language Models

By Jasper Riis-Hansen and Line Ejby Sørensen, Center for Humanities Computing (CHC), Aarhus University

DeiC Interactive HPC – UCloud plays a central role in the Danish Foundation Models (DFM) project, which forms part of the Danish government’s strategic initiative for artificial intelligence.

Danish Foundation Models (DFM) is supported by the Ministry of Digital Affairs as part of the national AI strategy, which aims to ensure that Denmark has access to advanced and tailored language models. These models are intended for use across a wide range of sectors, including healthcare, public administration, education, and private enterprise.

A shared digital environment

The DFM project brings together Danish universities, research institutions, and industry partners in a joint effort to establish new standards for ethically responsible and inclusive AI language technologies.

The project is a collaboration between Aarhus University, the University of Copenhagen, the University of Southern Denmark, and the Alexandra Institute. DeiC Interactive HPC – UCloud plays a vital role in this work by providing high data security, scalable computing power, and, not least, an accessible, secure, national cloud platform that enables collaboration among project partners.

“UCloud forms the foundation for an important step in research digitalisation, as the platform provides easy access to computing power, enabling scalable data analysis and modelling, while also offering a secure environment for handling sensitive data. The platform also facilitates collaboration across institutions and allows us to manage data access as needed. This is particularly relevant in the DFM project, which includes many partners participating at different levels.”
Postdoc Kenneth Enevoldsen

Data security and computing power

Because AI models are often trained on sensitive data, it is crucial that data processing complies with both GDPR and Danish security standards. UCloud is ISO27001-certified and specifically designed to meet both Danish and EU requirements for secure data handling.

“In the DFM project, we work with very large amounts of data from a variety of sources – including sensitive data that the models are trained on – and this places high demands on data security. That is why UCloud is such a valuable tool for the project – precisely because of its high level of data security and access to scalable computing power.”
Postdoc Kenneth Enevoldsen

Although DFM also makes use of European supercomputers such as LUMI in Finland and Leonardo in Italy, the day-to-day operations of the project are heavily reliant on UCloud. In addition to being a springboard for high-performance computing, UCloud also provides a secure and user-friendly platform with a wide range of accessible applications – all essential for daily research, collaboration, data processing, and innovation across the project’s interdisciplinary team.

Critical infrastructure for Danish AI development

DFM’s principal investigators, Kristoffer Nielbo and Peter Schneider-Kamp, emphasise that the robust digital research environment provided by DeiC Interactive HPC – UCloud constitutes critical infrastructure. It streamlines workflows, enhances collaboration, and accelerates the development of both language and AI technologies.

“Without UCloud, the DFM project would have had to develop this type of digital infrastructure itself – with significant time and financial costs. The platform’s role in the project clearly demonstrates how robust, collaborative digital research environments are essential to Denmark’s AI strategies.”

Danish Foundation Models (DFM) is a collaborative project involving Aarhus University, the University of Copenhagen, the University of Southern Denmark, and the Alexandra Institute.

The project is supported by the Ministry of Digital Affairs with a grant of DKK 30.7 million and aims to develop advanced language models with open access and transparent development processes.

The models are specifically tailored to Danish and other Scandinavian languages and cultures and are intended for use across sectors such as healthcare, public administration, education, and business.

DFM seeks to establish a new standard for ethically responsible, inclusive, and transparent AI language technology – for the benefit of both Danish society and the research community.

For more information, see the Danish Foundation Models website and the Ministry of Digital Affairs press release.


DeiC Interactive HPC Revolutionises Interdisciplinary Research with User-Friendly Supercomputing Access

With 10,000 users, DeiC Interactive HPC has established itself as one of Europe’s most popular HPC facilities, thanks to an unprecedented democratisation of access to advanced computing resources. These resources, once reserved for specialised research fields and technically adept specialists, are now accessible to any researcher with a dataset and a vision.

Through a newly developed, simple, and graphical user interface, DeiC Interactive HPC, also known as UCloud, makes it easier than ever to gain interactive access to supercomputing. This approach reduces technical barriers and enhances research collaboration by offering shared, easily accessible virtual environments. As a result, DeiC Interactive HPC supports dynamic and interdisciplinary research, accelerating research processes and promoting innovation in fields ranging from bioinformatics to digital humanities.

Democratising Access to HPC

The trend towards more interactive use of technology, including HPC, reflects efforts to make the STEM field more inclusive and accessible, mirroring broader societal changes towards diversity and inclusion in technology and science. DeiC Interactive HPC’s user-friendly approach has attracted a broad spectrum of users, including those from nearly all Danish universities and individuals with varying levels of technical expertise, notably many students.

“We are proud to highlight the growing diversity among DeiC Interactive HPC users, a development that further distinguishes DeiC Interactive HPC from traditional HPC systems. We see continuous growth in user numbers and are now celebrating surpassing 10,000 users across a very broad spectrum of research disciplines, which is impressive in the HPC field. Of these users, 50% are students, reflecting DeiC Interactive HPC’s success in attracting new users and serving as a bridge to larger European HPC facilities,” says Professor Kristoffer Nielbo, representing Aarhus University in the DeiC Interactive HPC Consortium.

By simplifying access to supercomputers, DeiC Interactive HPC democratises powerful data processing resources, enabling a wider range of researchers and academics to conduct innovative research without the steep learning curve traditionally associated with HPC. This inclusivity fosters scientific collaboration and creativity, enriching the HPC community with a diversity of perspectives and ideas.

“We continuously work to improve DeiC Interactive HPC with a democratic approach, using user feedback to ensure our focus is in the right place. This is also reflected in our new update – UCloud version 2 – which aims to increase efficiency and improve the user experience for researchers. It is part of our DNA as an interactive HPC facility to always keep the user in mind and develop apps and user interfaces based on user needs. Therefore, we encourage our users to reach out to us with their wishes and ideas,” says Professor Claudio Pica, representing the University of Southern Denmark in the DeiC Interactive HPC Consortium.

An All-Danish and Highly Secure System

Despite its international-sounding name, UCloud, DeiC Interactive HPC is part of the Danish HPC landscape, funded by the Danish universities and the Ministry of Higher Education and Science. The increased focus on developing a new generation of highly user-friendly applications means that researchers and other university staff can now use intuitive applications for transcribing sensitive data via DeiC Interactive HPC.

“DeiC Interactive HPC has already developed applications based on the same transcription technology found online and made them available in a secure environment through the UCloud platform. These transcription applications are just the beginning of a series of targeted secure applications that do not require prior experience, and we are always open to user input and ideas that arise from their unique needs but often prove beneficial to many,” says Lars Sørensen, Head of Digitalisation, representing Aalborg University and CLAAUDIA in the DeiC Interactive HPC Consortium.

By making advanced data processing more accessible to researchers from various disciplines, DeiC Interactive HPC helps break down the technical barriers that previously limited access to these resources. With an increasing number of students and new users from diverse backgrounds combined with continuous engagement in user-centred innovation, DeiC Interactive HPC not only supports the academic community but also plays a crucial role in promoting a more inclusive and productive research environment.


For further information and high-resolution graphics, contact:
Kristoffer Nielbo, Director of Center for Humanities Computing, Aarhus University, 26832608 kln@cas.au.dk

UCloud offers access to advanced tools such as quantum simulation apps and H100 GPUs as well as applications aimed at data analysis and visualisation.

In data analysis, Python and Jupyter notebooks are particularly prominent, catering to the interactive, ad hoc, and data-centric workflows common in the field. These tools are highly valued for their user-friendliness in handling rapidly changing software environments and offer rich user interfaces, a significant advantage compared to traditional HPC setups, which can be more complex or less flexible.

Furthermore, the integration of tools such as Conda for managing software packages, Jupyter notebooks, RStudio, Coder, and Dask for parallel computing significantly enhances the usability of HPC resources for interactive and on-demand data processing needs. These tools help bridge the gap between the hardware of complex HPC systems and the user-friendly software environments that data scientists require.

About DeiC Interactive HPC


DeiC Interactive HPC (UCloud) is a successful collaboration between three universities: SDU, AU, and AAU.

Aalborg University, CLAAUDIA, represented by Lars Sørensen

SDU, eScience Center, represented by Professor Claudio Pica

Aarhus University, Center for Humanities Computing, represented by Professor Kristoffer Nielbo


Video use case: HPC enlightens researchers in social sciences and humanities about human behavior

Sociologist Rolf Lyneborg Lund has trained an image AI using DeiC Interactive HPC, which can help us understand how people perceive the concepts of “good” and “bad” neighbourhoods.

Visit deic.dk to view video use case from the 2023 DeiC Conference


State-of-the-art GPUs for AI available through DeiC Interactive HPC

AI companies around the world are scrambling to get their hands on the latest and most powerful NVIDIA GPU, the H100. The biggest customers include OpenAI, Microsoft, and Google. Now, 16 NVIDIA H100 GPUs have landed at SDU, ready to be integrated into the DeiC Interactive HPC system. With the arrival of four servers with four H100 GPUs each at SDU, Danish researchers will be able to access the same hardware coveted by some of the biggest tech companies in the world.

Go to story

Image: NVIDIA Hopper H100 GPU. Credit: NVIDIA


Utilizing agent-based models with archaeological data

Supercomputing has long been associated with areas such as physics, engineering, and data science. However, researchers in the humanities at Aarhus University are increasingly turning to supercomputing, allowing them to delve into unexplored territories and discover new insights.
From analysing historical archives to simulating ancient civilizations to analysing social media data, supercomputing offers unique opportunities to generate insights and advance knowledge in the humanities.

In this article series, we highlight three cases with humanities researchers from Aarhus University that illustrate the varied ways in which supercomputing is being used in humanities research.


Iza Romanowska is an assistant professor at Aarhus University, working at the Aarhus Institute of Advanced Studies, where she studies complex ancient societies.

To overcome the challenges of limited data from these ancient societies, researchers have started utilizing agent-based models (ABMs), sometimes enabled by supercomputing. ABMs are computational models that simulate the behaviour and interactions of individual entities, known as agents, within a specified environment or system. Each agent in the model is typically programmed with a set of rules or algorithms that control its behaviour, decision-making processes, and interactions with other agents and the environment.
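The agent/rule/environment structure described above can be illustrated with a minimal toy sketch in Python. This is a hypothetical example, not drawn from Romanowska's work: each agent follows one simple trading rule, and the simulation reports the resulting distribution of goods.

```python
import random

class Trader:
    """A toy agent: holds goods and trades with a random partner each step."""
    def __init__(self, goods):
        self.goods = goods

    def step(self, partner):
        # The agent's single rule: the richer party gives one unit to the poorer.
        if self.goods > partner.goods:
            self.goods -= 1
            partner.goods += 1

def run_simulation(n_agents=100, n_steps=1000, seed=0):
    """Run one replicate: pair random agents for n_steps interactions."""
    rng = random.Random(seed)
    agents = [Trader(rng.randint(0, 10)) for _ in range(n_agents)]
    for _ in range(n_steps):
        a, b = rng.sample(agents, 2)
        a.step(b)
    return [t.goods for t in agents]

goods = run_simulation()
print(min(goods), max(goods))
```

Real archaeological ABMs are of course far richer, but the principle is the same: simple individual rules, repeated interactions, and an emergent aggregate pattern that can then be compared with the archaeological record.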

ABM is a valuable tool in archaeology that allows us to simulate and analyse the behaviours and interactions of individuals or groups in past societies, and the use of ABM allows comparison of the model against real archaeological data.

Assistant Professor Iza Romanowska

In one of Iza Romanowska’s studies, agent-based modelling (ABM) made it possible for her and her colleagues to explore the Roman economy in the context of long-distance trade, using ceramic tableware to understand the distribution patterns and buying strategies of traders in the Eastern Mediterranean between 200 BC and AD 300.  

The potential of supercomputing in the humanities becomes particularly evident when studying societies for which, as archaeologists and historians know well, only limited data survive. Iza Romanowska explains that the availability of data is limited in her field compared to other disciplines: while social scientists studying more contemporary populations have access to abundant data, such as the number of traders, transactions, and values, “we have none of this information.” The use of HPC has therefore been essential for her research.

ABM as methodological tool necessitates running the simulation many times, and by many, I mean eight hundred thousand times, and that is possible with a laptop… if one plans to be doing their Ph.D. for 500 years. Supercomputing is bigger, faster, better without any qualitative change in terms of the research.

Assistant Professor Iza Romanowska

Using a high-performance computer like the DeiC Interactive HPC system enhances the scalability and speed of ABMs, allowing researchers to gain deeper insights into the behavior and outcomes of complex systems. The DeiC Interactive HPC facility hosts out-of-the-box tools, like NetLogo, for working with ABM. Researchers can also use ABM frameworks for Python or R in one of the many development apps like JupyterLab or Coder.  
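Running a model eight hundred thousand times is an embarrassingly parallel workload: each replicate is independent, so the runs can simply be spread across the cores of an HPC node. A minimal sketch using only Python's standard library (with a toy random walk standing in for a real ABM; none of this is project code):

```python
import random
from concurrent.futures import ProcessPoolExecutor

def replicate(seed):
    """One independent simulation run (here a toy random walk);
    returns the quantity of interest for that run."""
    rng = random.Random(seed)
    position = 0
    for _ in range(10_000):
        position += rng.choice((-1, 1))
    return position

if __name__ == "__main__":
    # Each seed is one replicate; the pool spreads them over all available
    # cores. A real study would use far more replicates and a real model.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(replicate, range(100)))
    print(len(results))
```

On a laptop, hundreds of thousands of replicates of a heavy model are infeasible; on a many-core interactive HPC node the same loop scales out with no qualitative change to the research, which is precisely Romanowska's point.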

Supercomputing and coding as research tools advance humanities research 

While humanities data in general is plentiful and can be analysed effectively, Iza Romanowska finds that there is a gap in understanding the underlying processes that generate the observed patterns, resulting in underdeveloped explanatory frameworks. Her point is that the lack of formal tools for theory building and testing remains a major disciplinary issue. 

“Within the humanities, including archaeology and history, data analysis is well-established. However, there’s a kind of fundamental disciplinary problem in that we don’t have or use many computational tools for theory building and theory testing. Supercomputing as a tool for the humanities can contribute to filling this gap and strengthen theory building, and ultimately it can advance the field of humanities research.”

Assistant Professor Iza Romanowska

Iza Romanowska believes that more people in humanities should learn to code to take advantage of the possibilities offered by their data. She suggests that supercomputing can be a natural progression from this. While many humanities researchers may not feel like they need supercomputing, perhaps they are simply not asking questions that could benefit from high-performance computing (HPC). 

I would especially encourage junior researchers in the humanities to embrace supercomputing. It never hurts to acquire a skill, and many of these tools are becoming so easily available that it’s almost a shame to not use them.


You have just read the second of three cases in our series on Interactive HPC usage in humanities.
Through these compelling cases it becomes evident that supercomputing in humanities research is transforming traditional approaches, empowering researchers to uncover new insights and deepen our understanding of the field.  It opens doors to interdisciplinary collaborations and expands the possibilities for data analysis and modelling, ultimately shaping the future of digital humanities. 

Stay tuned for our third case, featuring Rebekah Baglini and her field of linguistics, and check out the first case, featuring Katrine Frøkjær Baunvig and the creation of a Grundtvig artificial intelligence using HPC.


Apply for HPC resources

Researchers at a Danish university have various options for gaining access to computing power at both Danish and international HPC facilities. Front office personnel, please inform your users that the fall call H1-2024 is now open for applications for access to the e-resources.

Information about the call is found on DeiC’s website.


Creating a Grundtvig-artificial intelligence using HPC

Beyond Tradition
Unveiling the Uses of Supercomputing in the Humanities

Supercomputing has long been associated with areas such as physics, engineering, and data science. However, researchers in the humanities at Aarhus University are increasingly turning to supercomputing, allowing them to delve into unexplored territories and discover new insights.
From analysing historical archives to simulating ancient civilizations to analysing social media data, supercomputing offers unique opportunities to generate insights and advance knowledge in the humanities.

In this article series, we highlight three cases with humanities researchers from Aarhus University that illustrate the varied ways in which supercomputing is being used in humanities research. 


Katrine Frøkjær Baunvig, head of the Grundtvig Center at Aarhus University, has used supercomputing as a methodological approach, and it has led her to non-trivial conclusions that significantly impact our understanding of 19th-century nation builder and prominent pastor N.F.S. Grundtvig’s vast body of works and his immense influence on Danish culture.

In order to conduct a certain type of text mining, so-called word embeddings, she has created a Grundtvig artificial intelligence, enabling a comprehensive analysis of his more than 1,000 works and 8 million words and resulting in unprecedented insights.

Grundtvig’s worldview: analysed by Katrine Frøkjær Baunvig in the upcoming paper ”Benign Structures. The Worldview of Danish National Poet, Pastor, and Politician N.F.S. Grundtvig”.

This approach has ushered in a completely new era in Grundtvig research, according to Katrine Frøkjær Baunvig. She dismisses the criticism from digital humanities sceptics who argue that word embeddings fail to consider the surrounding context of words.

“This type of rejection is prevalent only among researchers who have not taken the time to understand or familiarize themselves with the current state and level of the research. When creating a word embedding, I obtain a vast mapping of a given word’s extensive association structure. Therefore, I can clearly discern different semantic focal points and contexts where the word appears in Grundtvig’s body of work. This is precisely what allows me to gain an overview.” 

Katrine Frøkjær Baunvig, Head of the Grundtvig Center at Aarhus University

Katrine Frøkjær Baunvig opted to form a research partnership with the Center for Humanities Computing at Aarhus University. Her best advice for other researchers going into supercomputing in the humanities is to team up with the right people.  

“Stepping into the world of supercomputing requires an approach to work processes that, in my opinion, represents a new trend in the humanities, namely, interdisciplinary collaborations and team-based publishing. Someone takes care of what is typically called the domain expert area – in this case, knowledge of Grundtvig’s authorship – while others handle the more technical aspects of execution.”

Katrine Frøkjær Baunvig, Head of the Grundtvig Center at Aarhus University

She also emphasises the importance of comprehending the workings of the tools to better harness the power of supercomputing.  

“Even if you may not be able to train your algorithm yourself, it can be very practical to devote time and energy to obtain an operational understanding of the steps involved in creating a Grundtvig-artificial intelligence and the various types of applications such an intelligence can be used for.”

Katrine Frøkjær Baunvig, Head of the Grundtvig Center at Aarhus University
Grundtvig’s use of colour terms confirms his claim, written to Ingemann, that one cannot paint Christ with colour. This point is unfolded in another upcoming paper, “Med Farver kan man ingen Christus male”: En komputationel udforskning af farvebrugen i Grundtvigs forfatterskab (“One cannot paint Christ with colours”: A computational exploration of colour use in Grundtvig’s authorship), by Katrine Frøkjær Baunvig.

With years of experience in using supercomputing in her research, Katrine Frøkjær Baunvig plans to continue using it and encourages others to do so when it seems fit. Especially in times where humanities research is often dismissed as lacking scientific rigour, she sees an opportunity to make an impact. With a keen sense of responsibility to bring her field forward, she is determined to prove that humanities research can be just as methodical and rigorous as research in any other discipline.

“Researchers who have pioneering eagerness should explore supercomputing as it can give them a head start by venturing into “blue ocean” territory.” 

Katrine Frøkjær Baunvig, Head of the Grundtvig Center at Aarhus University

Katrine Frøkjær Baunvig has used the DeiC Interactive HPC system for a range of NLP tasks, such as linguistic normalisation of historical Danish, semantic representation learning and inference, and, finally, historical chatbot development based on a custom Large Language Model for Danish.
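Of these tasks, linguistic normalisation is the simplest to illustrate. The sketch below is a hypothetical first-pass rule, not Baunvig's actual pipeline: the 1948 Danish spelling reform replaced 'aa' with 'å', so 19th-century spellings can be partially modernised with a single substitution.

```python
import re

# One toy rule from the 1948 Danish spelling reform: 'aa' became 'å'.
# A real normaliser for historical Danish needs many more rules and
# exception lists (e.g. place names like Aarhus); this is illustrative only.
_AA = re.compile(r"[Aa]a")

def normalise(text):
    """Replace 'Aa'/'aa' with 'Å'/'å', preserving capitalisation."""
    return _AA.sub(lambda m: "Å" if m.group(0)[0] == "A" else "å", text)

print(normalise("Aand og Aabenbaring"))  # -> Ånd og Åbenbaring
```

Applied across millions of words, even simple rules like this make historical text tractable for modern NLP models trained on contemporary Danish.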


You have just read the first of three cases in our series on Interactive HPC usage in humanities.
Through these compelling cases it becomes evident that supercomputing in humanities research is transforming traditional approaches, empowering researchers to uncover new insights and deepen our understanding of the field.  It opens doors to interdisciplinary collaborations and expands the possibilities for data analysis and modelling, ultimately shaping the future of digital humanities. 

Stay tuned for our second and third cases, featuring Iza Romanowska and Rebekah Baglini and their fields of archaeology and linguistics.


UCloud as a complementary HPC tool within theoretical particle physics

Though supercomputers form the key basis of his research, UCloud has been a valuable, complementary tool for Tobias and his colleagues and will most likely continue to be so in future work as well.

Postdoc Tobias Tsang works within the broader research field of theoretical particle physics. As part of the Centre for Cosmology and Particle Physics Phenomenology (CP3-Origins) at the University of Southern Denmark, his research more specifically concerns quantum field theory and quantum chromodynamics (QCD), including how fundamental particles, protons, and neutrons interact with each other:

My research aims to provide high precision predictions based solely on the theory of the Standard Model – the best-known understanding of the interaction of fundamental (i.e. not containing ‘smaller constituents’) particles. This is done via very large-scale numerical simulations using the most powerful supercomputers around the world.

Postdoc Tobias Tsang, Centre for Cosmology and Particle Physics Phenomenology (CP3-Origins), University of Southern Denmark

Experience and achievements

More traditional mathematical methods that can be written down with pen and paper do not apply to research on quantum chromodynamics. As such, Tobias’ research relies on a method called ‘Monte Carlo’, which is applied to compute statistical field theories of simple particle systems. Though this type of research is done using very large supercomputers, Tobias has recurrently applied UCloud for exploratory studies of smaller volumes of data:
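The Monte Carlo idea itself, estimating a quantity by averaging over many random samples, can be shown in a few lines. This is a generic textbook example (estimating π), not lattice QCD; the lattice case applies the same principle to vastly larger configuration spaces:

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by sampling points uniformly in the unit square and
    counting the fraction that lands inside the quarter circle."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(monte_carlo_pi(100_000))
```

The statistical error shrinks only as the square root of the number of samples, which is why production lattice simulations need thousands of cores while exploratory parameter scans of the same kind fit comfortably on a resource like UCloud.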

When doing large scale simulations, we sometimes do it on something called ‘10,000 cores in parallel’, and clearly this is not something we can easily do on a resource like UCloud. But for the small exploratory studies, UCloud is a nice resource in the sense that it is available; you don’t have to sit here on a hot day and burn your laptop to death – you can send it to UCloud and run it there. I think this is kind of the point where I have used UCloud the most; for small exploratory studies and some of the projects that don’t need a huge amount of computer time but still a significant portion.

Postdoc Tobias Tsang

Though UCloud has served as a supplemental rather than a key tool in Tobias’ work together with the CP3-Origins research centre, he describes it as a nice complement to other HPC resources:

“I don’t think UCloud will ever be the only resource we use. But this is also the design of it; UCloud is not meant to be a huge machine, it is meant to be an available resource that is easy to use and that gives you a playground to set up things really from scratch where you can test things out and run smaller jobs and analyses. In that sense, it is quite complementary to a lot of the things we normally work with. For exploratory studies and for code testing, UCloud will definitely remain very useful.”

Postdoc Tobias Tsang

In one specific project, done at SDU a few years back as a collaboration between CP3 and IMADA (the Department of Mathematics and Computer Science), the vast majority of samples were generated on UCloud, and a significant amount of data production and measurements were also carried out there [1]. UCloud needs, however, to be considered part of a whole, according to Tobias:

“It is not that one particular machine made it possible; we would otherwise have found another machine to run it on. But UCloud provided us with a nice set up where we could just use local resources without having to go through big grant applications to get computer time.”

Postdoc Tobias Tsang

Pros and cons

In terms of time optimization, UCloud has also been a game changer for Tobias:

One of the nice things about UCloud compared to other machines is the wall clock time: quite often, for larger clusters, depending on the cluster though, you are very much restricted by the queue policies. So, there are some clusters where you have a maximum run time of 4 hours, and if you happen to run a small job that is longer than this, then you can’t – you have to always tailor your job to fit exactly and to make the maximum use of it. On UCloud you have a 200-hour wall clock. This is very helpful as for a lot of these things that have to run sequentially, you might not need a huge resource, you just need to have a long enough time span to actually do it.

Postdoc Tobias Tsang

Though UCloud slowed the work process down a bit in the beginning as everything had to be installed and set up, this downside was quickly resolved and overshadowed by the benefits: 

“Once you get used to it, you can kind of equalize the work process to what you would have on a cluster where everything is just readily installed.”

Postdoc Tobias Tsang

Despite pros and cons, Tobias describes UCloud as a flexible system:

The fact that UCloud is really just a virtual machine has both positive and negative sides. The positive side is that you are really free to do whatever you want to do; you can install everything and you don’t have any restrictions that you would have on larger clusters where you can’t easily install software, or you can’t install it into the parts where you want to install it. On larger clusters, you are typically limited by the compilers that are already there. So, from that point of view, UCloud, at least to me, seems like a more flexible system. The downside is that you have to install everything; you can’t just quickly run something, you kind of have to constantly install everything from scratch.

Postdoc Tobias Tsang

Last but not least, Tobias stresses the interaction with the UCloud front office as a major benefit that has helped the research group significantly, especially compared to other clusters with a much longer response time:

One of the nice things with UCloud as a general system is that every time something didn’t work, we got a really quick email back. Any questions we raised were answered quickly, so it was never something that kept us stuck for weeks or months – typically things were resolved in a very timely time scale. And things that we actively suggested as nice features or things that we thought were missing on UCloud were likewise addressed.

Postdoc Tobias Tsang

[1]  Della Morte, Jaeger, Sannino, Tsang and Ziegler, “One Flavour QCD as an analogue computer for SUSY”, PoS LATTICE2021 (2022) 225, https://doi.org/10.22323/1.396.0225


National Health Data Science Sandbox for Training and Research

UCloud is not just an ideal platform for the individual researcher who wants interactive access to HPC resources or an easy way to collaborate with national or international partners. It is also highly suitable for teaching. Jennifer Bartell and Samuele Soraggi, who are both working on the project National Health Data Science Sandbox for Training and Research, share their experiences with using UCloud.

National “sandbox” platform

The growing amounts of data in all research fields offer researchers new opportunities for scientific breakthroughs. In the case of health science, the use of large amounts of data has great potential to improve our health care; it can, for example, expand our ability to understand and diagnose diseases. One of the constraints of using health data is that many datasets (e.g. person-specific health records or genomics data) are sensitive from a patient privacy perspective and governed by strict access and usage guidelines. This can be a major challenge, in particular for students or researchers who are just learning best practices in handling health data while also developing data science skills.

Go to SDU eScience for full story


Registration for the DeiC Conference 2022 is open

Registration is now open for this year's DeiC conference, which focuses on maturity and alignment.

The conference's main theme is "Alignment and Maturity: Implementing Research Infrastructure Solutions".

The programme is divided into four tracks: data management, supercomputing (HPC), networks and services, and security. Within each track, the focus will be on 'maturity' and 'alignment', as well as strategies for addressing challenges within the research infrastructure.

See the programme and register for the conference.