Redis on Ubuntu 22.04: Installation & Security

Redis can significantly improve the performance of your system. We will show you how to install, configure, and secure the powerful in-memory database efficiently on Ubuntu 22.04.

Redis (REmote DIctionary Server) is a powerful in-memory data store that is often used for caching and messaging purposes. In this blog post, we explain how to install and configure the software on an Ubuntu 22.04 system to ensure secure and efficient operation.

Step 1: Install Redis

First you need to install the software on your Ubuntu 22.04 system. To do this, open a terminal and execute the following commands:

sudo apt update
sudo apt install redis-server

Step 2: Configure Redis

After installation, you must edit the configuration file to ensure optimum performance and security. Open the file /etc/redis/redis.conf with a text editor of your choice:

sudo nano /etc/redis/redis.conf

You should make the following changes in the configuration file:

  • Supervised Directive: Change supervised no to supervised systemd. This allows Redis to run under the control of systemd.
  • Bind Directive: Make sure the line reads bind 127.0.0.1 ::1 so that Redis listens only on localhost and cannot be reached from outside.

Save the file and exit the editor.
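The supervised change can be sketched in a few lines of Python operating on the configuration text (a sketch only; in practice you edit /etc/redis/redis.conf directly, and the sample text below merely stands in for the real file):

```python
import re

# Stand-in for the relevant lines of /etc/redis/redis.conf.
conf = """\
supervised no
bind 127.0.0.1 ::1
"""

# Switch Redis to systemd supervision; the bind line stays as-is.
conf = re.sub(r"^supervised no$", "supervised systemd", conf, flags=re.M)
print(conf.splitlines()[0])  # → supervised systemd
```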

Step 3: Restart Redis

After you have changed the configuration, restart the software to apply the changes:

sudo systemctl restart redis.service

Step 4: Secure Redis


Take the following steps to protect your server from unauthorized access:

  • Password protection: Add a requirepass directive to the configuration file and set a strong password.
  • Firewall settings: Make sure that your server is not publicly accessible. Use firewall rules to restrict access.
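Both safeguards can be checked mechanically. The helper below is a hypothetical sketch (the function name and sample config are ours, and the password shown is a placeholder only, not a recommendation):

```python
# Hypothetical helper: check that a redis.conf text contains the two
# safeguards discussed above (password set, bound to localhost).
def is_hardened(conf: str) -> bool:
    lines = [line.strip() for line in conf.splitlines()]
    has_password = any(line.startswith("requirepass ") for line in lines)
    local_only = any(line.startswith("bind 127.0.0.1") for line in lines)
    return has_password and local_only

sample = """\
bind 127.0.0.1 ::1
requirepass use-a-long-random-password-here
"""
print(is_hardened(sample))  # → True
```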

Step 5: Check the installation

Use the redis-cli client to check whether the in-memory database is working properly:

redis-cli

After entering the password with the auth command (if you set one), you should be able to execute commands, for example ping, which returns PONG, and interact with your instance.


Installing and securing Redis on Ubuntu 22.04 is a straightforward process that can significantly improve the performance of your system. By following these steps, you will ensure that your Redis server runs securely and efficiently.

NexGen Cloud presents Europe's first AI supercloud


With its 1 billion dollar AI supercloud, NexGen Cloud has ushered in a new era of AI in Europe. The ambitious project sets new standards in terms of sustainability and promises to significantly increase the performance and efficiency of European technology companies.

In a groundbreaking move to strengthen AI infrastructure in Europe, NexGen Cloud, a sustainable infrastructure as a service provider, has launched one of the first AI supercloud projects in Europe. With an investment of $1 billion, the UK-based company has created a specialized, compute-intensive platform for technology companies, organizations and governments in Europe. Of this sum, 576 million dollars has already been committed to hardware orders from suppliers. NexGen Cloud is thus positioning itself as a leading player in the European AI sector.

Accelerated and sustainable computing for Europe

NexGen Cloud aims to meet the growing demand for accelerated computing with its AI Supercloud, launched in October. Demand is driven in particular by the increased interest in generative AI and other AI applications that promise innovation and efficiency in the technology industry. The platform also gives European companies and start-ups regional and cost-effective access to GPU cloud services. By June 2024, the cloud will have over 20,000 NVIDIA H100 Tensor Core GPUs, making it one of the world's most powerful platforms.

A key aspect of the AI supercloud is its sustainable focus. It is powered exclusively by European data centers that are supplied with 100% renewable energy. This underlines NexGen Cloud's commitment to sustainable technology solutions in industries such as healthcare, finance, media and entertainment. Compliance with European data protection and security guidelines is also particularly noteworthy.

Partnership and access

To support the funding for its AI supercloud, NexGen Cloud has partnered with Moore and Moore Investments Group (MMI). A dedicated fund has also been set up and has already attracted investment from its private investors.

Access to the AI Supercloud is provided via NexGen Cloud's Hyperstack platform for the first 12 months. This is an NVIDIA GPU-accelerated cloud platform that provides direct access to computing resources for the European market.

Overall, NexGen Cloud's AI Supercloud marks a significant step forward for AI technology in Europe. The strong focus on sustainability, performance and data protection sets new standards.

Source: Cloud Computing News

NoSQL database management systems and models in comparison

Document-oriented, key-value, column-oriented and graph databases are revolutionizing data management. Discover which NoSQL database management system or model best suits your specific application needs.

In the world of database technology, NoSQL database management systems have become increasingly important in recent years. These systems offer alternative models for storing and retrieving data that differ from traditional relational databases.

You can find a general introduction to databases here. In this blog post, we compare the different types of NoSQL databases and their respective models to understand how they can be used in different use cases.

NoSQL database models


  • Document-oriented databases store information in documents (usually in JSON format), which are more flexible than the rigid tables of relational databases. Popular examples are MongoDB and Couchbase. They are ideal for applications with variable data structures and fast development cycles.
  • Key-value databases are the simplest form of NoSQL databases and store data as key-value pairs. Redis and DynamoDB are well-known examples. These databases offer extremely fast read and write access and are well suited for session storage, caching and in scenarios where fast access to large amounts of data is required.
  • Column-oriented databases store data in columns instead of rows, which enables more efficient querying and storage of large amounts of data. Cassandra and HBase are well-known representatives of this category. They are particularly suitable for analytical applications where large data sets need to be queried quickly.
  • Graph databases are designed for storing and querying relationships between data. Examples are Neo4j and Amazon Neptune. They are mainly used in social networks, for recommendation systems and in complex network analyses.
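The four models can be contrasted by how each might hold the same user record. The sketch below uses plain Python structures as stand-ins (the record and field names are invented for illustration):

```python
import json

# Document-oriented (e.g. MongoDB): one flexible JSON document per entity.
document = json.dumps({"id": 42, "name": "Ada", "tags": ["admin", "beta"]})

# Key-value (e.g. Redis): an opaque value behind a single key.
kv_store = {"user:42": document}

# Column-oriented (e.g. Cassandra): values grouped per column, keyed by row id.
columns = {"name": {42: "Ada"}, "tags": {42: ["admin", "beta"]}}

# Graph (e.g. Neo4j): nodes plus explicit, queryable relationships.
nodes = {42: {"name": "Ada"}, 7: {"name": "Grace"}}
edges = [(42, "FOLLOWS", 7)]

print(json.loads(kv_store["user:42"])["name"])  # → Ada
```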

Comparison and areas of application


  • Flexibility: Document-oriented and key-value databases offer more flexibility in terms of data structures compared to column and graph databases.
  • Scalability: Column-oriented and key-value databases are known for their high scalability, which makes them suitable for big data applications.
  • Query complexity: Graph databases offer advanced query capabilities for complex relationship networks, while document-oriented databases offer a good balance between flexibility and query complexity.
  • Performance: Key-value and column-oriented databases generally offer higher performance for read and write operations than document-oriented and graph databases.



The choice of a NoSQL database depends heavily on the specific use case. Document-oriented databases are ideal for flexible data structures and rapid development. Key-value systems are suitable for applications that require high performance with simple queries. Column-oriented databases are the best choice for analytical applications with large data sets, while graph databases are unsurpassed in scenarios where complex relationship networks are in the foreground.

In either case, it is important to understand the specific requirements and goals of a project in order to select the most appropriate database model.

Technology failures and complex clouds cost revenue


A new survey by Hitachi Vantara reveals alarming facts. Find out how IT as a Service (ITaaS) can save you from losing revenue due to technology failures and what obstacles companies need to overcome to succeed in the digital era.

Today's business world is heavily reliant on technology. New data shows that over 40% of organizations are losing revenue due to technology failures and complex cloud systems. These alarming findings come from a survey conducted by Hitachi Vantara, Hitachi's advanced infrastructure, data management and digital solutions subsidiary. The survey included 213 IT executives in North America and Europe. It shows that 55% of organizations are struggling to gain meaningful insights from their data.

Challenges and obstacles for companies

The main issues identified in the survey are security concerns, inflexible systems, siloed data, skills shortages and the need for more flexible infrastructure. These challenges are compounded by the increasing complexity of data, outdated technologies and rising costs due to old or obsolete infrastructures.

Another worrying finding from the survey is that 56% of organizations are experiencing significant revenue losses due to technology failures. Half of organizations are struggling with high total cost of ownership (TCO) or technical debt associated with key applications. 45% struggle to navigate complex cloud landscapes.

Gary Lyng, Vice President of Product and Solutions Development at Hitachi Vantara, commented on the findings: "In today's digital era, IT is not just a department, but a driving force that enables progress. It enables organizations to innovate, collaborate and thrive in an ever-evolving technological environment. But complexity hinders innovation, emphasizing the need for trusted specialists to simplify the setup for seamless access to data and applications."

Solutions through IT as a Service (ITaaS)

The survey also shows that 42% of executives are expanding their use of IT as a Service (ITaaS). Organizations are using ITaaS to shift their IT funding models from capital budgets to operational costs that are easier to predict and budget. This has resulted in an average 20% reduction in total cost of ownership. Over the next three years, the use of ITaaS is expected to increase to 86%. The report emphasizes the importance of choosing the right partner for success.

Outlook for the future

Overall, this survey shows that companies are struggling with the increasing complexity of the technology landscape. Solutions such as ITaaS can help to overcome these challenges, minimize technology failures and increase flexibility and innovation. It is crucial to choose the right partners and models in order to move successfully into the digital future.

Source: Cloud Computing News

RDBMS in Comparison: SQLite vs. MySQL vs. PostgreSQL


SQLite, MySQL or PostgreSQL? We compare the three leading RDBMSs to help you decide which system is best suited to your project requirements.

SQLite, MySQL and PostgreSQL are three of the best-known relational database management systems (RDBMS). Each has its own strengths and areas of application. In this blog post, we compare these three systems in terms of various aspects such as performance, use cases and range of functions.


SQLite

SQLite is a lightweight database engine written in C and embedded directly into the host application. It is ideal for programs that need a simple, self-contained database solution.


Advantages:

  • Lightweight: No separate server installation required.
  • Simplicity: Easy integration into applications, especially useful for mobile applications and small desktop projects.
  • Little configuration: Almost no configuration required.


Disadvantages:

  • Limited scalability: Not ideal for very large data volumes.
  • Limited concurrency: Not designed for highly concurrent access.
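The "no server, almost no configuration" point is easy to see with Python's bundled sqlite3 module: the whole database lives in a single file (or, as here, in memory) and is accessed in-process.

```python
import sqlite3

# No server process: connect straight to a file or an in-memory database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
con.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
print(con.execute("SELECT body FROM notes").fetchone()[0])  # → hello
con.close()
```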


MySQL

MySQL, which is managed by Oracle, is one of the most popular open-source RDBMSs for web applications.


Advantages:

  • Widely used: A large community and extensive support.
  • Good performance: Efficient for web-based applications.
  • Scalability: Suitable for larger applications.


Disadvantages:

  • Licensing: Can be complicated for commercial use.
  • Less advanced functions: Compared to PostgreSQL, some advanced functions are missing.


PostgreSQL

PostgreSQL, often referred to as Postgres, is an advanced open source RDBMS.


Advantages:

  • Advanced functions: Supports advanced database features such as complex queries and relational integrity.
  • High conformity with standards: Very close to SQL standards.
  • Expandability: Can be customized for specific requirements.


Disadvantages:

  • Complexity: Can be a little overwhelming for beginners.
  • More resource-intensive: Requires more system resources than SQLite.


The choice of the right RDBMS depends on the specific requirements of the project. SQLite is excellent for smaller applications and is extremely lightweight. MySQL is a solid choice for web applications and offers a good balance between functionality and simplicity. PostgreSQL, on the other hand, is ideal for complex, large database applications where advanced features are required.

Each of these systems has its justification and use cases, so the decision ultimately depends on the specific requirements and goals of each project.

Identity360 revolutionizes Identity Management


ManageEngine presents Identity360, an innovative, cloud-native identity management platform. It is designed to tackle modern challenges in corporate IT.

ManageEngine, the enterprise IT division of Zoho Corporation, has launched Identity360, a groundbreaking cloud-native identity management platform. This innovation promises to overcome the increasingly complex challenges of identity and access management (IAM) in companies.

ADManager Plus: New functions for secure compliance

In addition to Identity360, ManageEngine has also announced new features for ADManager Plus, an identity management and administration solution. By introducing access certification and identity risk assessment, the company strengthens organizations' compliance structures and security mechanisms.

Focus on digital evolution of the workforce

The digital transformation of the workforce is progressing steadily, and employees expect mobility and security at the same time. Effective management of identity complexities and ensuring regulatory compliance with an optimal user experience are becoming more crucial than ever.

Identity360: Centralized administration and security

Identity360 provides a centralized platform that integrates directory services and applications to facilitate the management of user identities and effectively enforce access controls. This helps companies optimize their business processes through comprehensive identity lifecycle management and workflow orchestration.

Access certification and risk assessment for Active Directory

The new functions in ADManager Plus not only promote cyber security; they also help anticipate risks and provide immediate measures to mitigate potential threats. Access certification campaigns support segregation of duties and the principle of least privilege.

Highlights of ManageEngine's IAM solutions

ManageEngine emphasizes the many possibilities of its IAM solutions: from a centralized universal directory and consolidated management of user identities to MFA-secured single sign-on (SSO) and enhanced security measures.

Source: Cloud Computing News

SQL Constraints: The Guardians of Data Integrity


In the world of databases, SQL constraints play a crucial role in ensuring the integrity and quality of data. Dive into the foundation of reliable data management with us!

SQL constraints are rules that are applied to the data in a database to ensure its accuracy and reliability. They help to define and maintain the relationships between tables. As such, they are an essential element in database integrity.

The main types of SQL constraints

  • PRIMARY KEY: This constraint uniquely identifies each row in a table. Only one primary key can be assigned to each table. No two rows can have the same value for the primary key.
  • FOREIGN KEY: A foreign key is one or more columns that are used to create a link between the data in two tables. It refers to the primary key of another table and ensures referential integrity.
  • NOT NULL: This constraint ensures that a column cannot have a NULL value. A value must be specified for this column when inserting or updating data in the table.
  • UNIQUE: The UNIQUE constraint guarantees that all values in a column are different. In contrast to the primary key, a table can have several UNIQUE constraints and it is also permitted to have NULL values.
  • CHECK: This constraint ensures that all values in a column meet certain conditions. For example, you could use a CHECK constraint to ensure that a person's age is always greater than 18.
  • DEFAULT: The DEFAULT constraint can be used to set a default value for a column if no value is specified.
  • INDEX: Although an index is not a constraint in the strict sense, it is often used in conjunction with constraints to improve query performance, especially when frequently searching or filtering data.
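Several of these constraints can be seen in action with a few lines of SQL. The sketch below uses SQLite via Python's sqlite3 module, with invented table and column names; note that SQLite enforces FOREIGN KEY constraints only when the pragma is enabled.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FOREIGN KEY only when enabled

con.execute("""
    CREATE TABLE person (
        id    INTEGER PRIMARY KEY,              -- unique row identifier
        email TEXT NOT NULL UNIQUE,             -- required and distinct
        age   INTEGER CHECK (age > 18),         -- business rule from the text
        role  TEXT DEFAULT 'user'               -- filled in when omitted
    )
""")
con.execute("CREATE TABLE login (person_id INTEGER REFERENCES person(id))")

con.execute("INSERT INTO person (email, age) VALUES ('a@example.com', 30)")

# CHECK rejects an age of 18 or below.
try:
    con.execute("INSERT INTO person (email, age) VALUES ('b@example.com', 12)")
except sqlite3.IntegrityError:
    print("CHECK violation rejected")

# FOREIGN KEY rejects a reference to a person that does not exist.
try:
    con.execute("INSERT INTO login (person_id) VALUES (999)")
except sqlite3.IntegrityError:
    print("FOREIGN KEY violation rejected")

# DEFAULT supplied the role automatically.
print(con.execute("SELECT role FROM person").fetchone()[0])  # → user
```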

Understanding and properly applying SQL constraints are critical to developing and maintaining a robust database. Not only do they help maintain data integrity, but they can also increase the efficiency of database queries and prevent data management errors. When modeling data, it is important to choose constraints carefully so that the database correctly reflects and supports the business logic.

Biopharma in the age of data and AI


In a recent study, Benchling reveals the opportunities and challenges facing the biopharma industry in the age of data and AI. We have summarized the most important information for you.

Benchling's "2023 State of Tech in Biopharma" study shows how biopharma companies are using technology in an era of data and artificial intelligence (AI) and what obstacles they need to overcome. The study surveyed 300 research and IT professionals from biopharma companies. It was the first to examine how the industry is deploying technology stacks consisting of robotics and automation, networked instrumentation, research data platforms, cloud-based scientific applications, and AI and machine learning.

Technology adoption in the biopharma industry

The biopharma industry has made great strides in implementing technologies. The study shows that 70% of respondents use research data platforms. Robotics and automation platforms follow in second place with 63% and AI/ML with 59%. However, the use of SaaS software lags behind. Only 18% of respondents reported using it for most of their research and IT work.

This high technology use is also reflected in the complex technological environment of the labs. Over half of scientists report using more than five different scientific software applications daily. Meanwhile, 40% of IT staff in large organizations support more than 20 applications. Consequently, 84% of organizations still rely on custom software solutions. This, in turn, indicates outdated systems.

Collaboration and data availability challenges

The survey also highlights the challenges in terms of collaboration. Around 41% of scientists find collaboration with other teams problematic. Yet 38% of survey participants must collaborate with 20 or more people on a daily basis. Despite the use of modern technologies, the FAIR data principles of interoperability (I) and reusability (R) still seem out of reach. Only 28% of respondents said they had achieved organization-wide data interoperability. Data reusability was mentioned by 30%.

The greatest barriers to technology adoption

According to the study, the two biggest barriers to technology adoption in the biopharma industry are a lack of qualified personnel and a lack of solutions tailored to science. This underscores the industry's urgent need to attract qualified professionals who are willing to develop new software solutions tailored to the needs of science.

Necessary steps to improve the situation

Benchling's research shows that companies need to take a critical step to foster greater alignment between their research and IT departments. This also means rationalizing different technology priorities and being honest about the real barriers to technology adoption. These include cultural and organizational changes, among others. In the coming years, the importance of scientific data and AI will continue to grow. This will increase pressure on the biopharma industry to abandon outdated technologies and build a solid digital foundation.


Benchling's "2023 State of Tech in Biopharma" study highlights the opportunities and challenges facing the biopharma industry when it comes to implementing and leveraging cutting-edge technologies. Attracting talented professionals is essential to remain competitive and accelerate scientific progress. In addition, collaboration between research and IT must be fostered and appropriate technologies must be implemented. This is the only way companies can meet the demands of the digital age in the biopharma industry and ensure long-term success.

Source: Cloud Computing News

An Introduction to Databases


Databases are the backbone of modern data processing and storage. We give you an introduction to the topic and explain why databases are so important.

Databases are the foundation for storing and managing information in the digital era and consequently a fundamental part of our modern world. Be it for businesses, government agencies or even in our everyday lives: Database systems play a crucial role in the storage, organization and management of data.

What exactly is a database?


A database is a structured collection of information or data stored in electronic form. This data can range from text and numbers to images and multimedia content. Databases are used to efficiently organize, store and retrieve information. They are an indispensable tool when it comes to managing large amounts of data.

Types of databases


There are several types of databases, including:

  • Relational databases store data in tables called "relations". They use SQL (Structured Query Language) to query and manage data. Popular relational database management systems (DBMS) are MySQL, PostgreSQL and Microsoft SQL Server.
  • NoSQL databases do not use tables, but store data in various formats such as documents, columns or key-value pairs. They are particularly useful when processing unstructured data. Examples are MongoDB and Cassandra.
  • Graph databases specialize in representing relationships between data points. They are often used in social networks, recommendation systems, and knowledge management. Examples of graph databases are Neo4j and Amazon Neptune.
  • Document databases store information in documents, usually in JSON or XML format. They are particularly useful for data-intensive applications and content management systems. Examples include MongoDB and CouchDB.
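The first bullet above, tables queried with SQL, can be illustrated in a few lines. The sketch uses Python's bundled sqlite3 driver with invented sample data; MySQL or PostgreSQL would accept the same SQL through a different driver.

```python
import sqlite3

# A relation ("table") with rows, queried declaratively with SQL.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE city (name TEXT, population INTEGER)")
con.executemany(
    "INSERT INTO city VALUES (?, ?)",
    [("Berlin", 3_700_000), ("Hamburg", 1_900_000), ("Bremen", 570_000)],
)
big = con.execute(
    "SELECT name FROM city WHERE population > 1000000 ORDER BY name"
).fetchall()
print([n for (n,) in big])  # → ['Berlin', 'Hamburg']
```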

Why are databases important?

  • Data organization: They help to store information in a structured and efficient way. They allow data to be organized into well-defined categories and relationships.
  • Data integrity: They provide mechanisms to ensure data integrity by ensuring that stored information is consistent and accurate.
  • Data query: Using query languages such as SQL, users can retrieve and analyze data in a targeted manner. This is critical for decision making and reporting.
  • Scalability: They are scalable, which means they can handle growing volumes of data. This is of great importance for companies.
  • Security: They provide features to control access to sensitive information and protect data from unauthorized access.

Spending on cloud infrastructure is on the upswing


Spending on cloud infrastructure increased by nearly 8% in Q2 2023, while spending on non-cloud hosted infrastructure declined. These trends show: The need for robust configurations for complex workloads and AI initiatives is growing.

Spending on compute and storage infrastructure products in cloud deployments saw impressive year-on-year growth of almost 8% in the second quarter of 2023, rising to 24.6 billion US dollars (20.36 billion euros) according to the latest data from IDC.

Shared cloud infrastructure tops the winners list

Particularly notable is the growth in shared cloud infrastructure, where spending rose 13.7% year-on-year to $17.9 billion in the latest quarter. This increase contrasts with spending on non-cloud hosted infrastructure, which fell 8.3% to $14.4 billion. Shared cloud infrastructure now accounts for nearly half (45.8%) of total infrastructure spending, while spending on dedicated cloud infrastructure has decreased by 4.9%.

Forecast: Continued growth for cloud infrastructure and challenges for non-cloud hosted infrastructure

IDC's forecasts are based on the Worldwide Enterprise Infrastructure Tracker. For 2023, they foresee a growth of 10.6% in cloud infrastructure spending to 101.4 billion US dollars. Non-cloud hosted infrastructure, on the other hand, is expected to decline 7.9% to $58.5 billion.

IDC emphasized that the forecast for non-cloud hosted infrastructure is "cautious" due to expected challenges. Still, overall cloud spending is expected to "remain positive due to new and existing mission-critical workloads that often require performance-oriented systems."

Future trends: focus on complex workloads and AI initiatives

Juan Pablo Seminara, Research Director at IDC, highlighted that "cloud infrastructure spending is shifting toward higher-performance configurations targeted at more complex workloads and new AI initiatives." Despite a drop in demand in the first half of the year, the outlook for 2023 remains positive. Higher average selling prices are expected.

Long-term trends continue to point upward. IDC forecasts that cloud infrastructure spending will reach $156.7 billion by 2027. This represents an annual growth rate of 11.3% and accounts for nearly 70% (69.4%) of total compute and storage infrastructure spending.

The future of cloud infrastructure spending looks promising. Enterprises are increasingly investing in high-performance systems to meet the demands of complex workloads and AI initiatives. As a result, cloud infrastructure remains on a growth trajectory and is expected to continue to outpace non-cloud hosted infrastructure.

Source: Cloud Computing News