Open-source News

RHEL9-Derived Oracle Linux 9 Developer Preview Released With 5.15-Based UEK Kernel

Phoronix - Tue, 06/14/2022 - 17:16
Oracle on Monday released the Oracle Linux 9 Developer Preview as their take on Red Hat Enterprise Linux 9, which reached general availability last month...

FreeCAD 0.20 Released For Open-Source CAD Software

Phoronix - Tue, 06/14/2022 - 17:00
For those interested in open-source CAD solutions, FreeCAD 0.20 is out today as the newest version of this general-purpose 3D computer-aided design modeler that has been in development for nearly twenty years...

Building the metaverse with open source

opensource.com - Tue, 06/14/2022 - 15:00
By Liv Erickson

The word metaverse has been thrown around a lot these days. Whether you believe it's a reality or not, the adoption of the term has signaled a significant shift in the way people think about the future of online interactions. With today's technological advancements and an increase in geographically distributed social circles, the idea of seamlessly connected virtual worlds as part of a metaverse has never felt more appealing.

Virtual worlds enable a wide range of scenarios and bring to life a rich and vibrant array of experiences. Students can explore the past by stepping into a historical time period, embodying historic figures, and interacting with buildings constructed centuries ago. Coworkers can gather for coffee chats, regardless of where in the world they're working. Musicians and artists can interact with fans from around the world in small or large digital venues. Conferences can reach new audiences, and friends can connect to explore interactive spaces.

When virtual world platforms (the predecessors to today's metaverse applications) were first built, access to powerful graphics hardware, scalable servers, and high-bandwidth network infrastructure was limited. Recent advancements in cloud computing and hardware optimization, however, have allowed virtual worlds to reach new audiences, and the complexity of what we're able to simulate has increased significantly.

Today, there are several companies investing in new online virtual worlds and technologies. To me, this is indicative of a fundamental shift in the way people interact with one another, create, and consume content online.

Some tenets associated with the concept of the metaverse and virtual worlds are familiar through the traditional web, including identity systems, communication protocols, social networks, and online economies. Other elements, though, are newer. The metaverse is already starting to see a proliferation of 3D environments (often created and shared by users), the use of digital bodies, or "avatars", and the incorporation of virtual and augmented reality technology.

Building virtual worlds the open source way

With this shift in computing paradigms, there's an opportunity to drive forward open standards and projects encouraging the development of decentralized, distributed, and interoperable virtual worlds. This can begin at the hardware level, with projects like Razer's Open Source Virtual Reality (OSVR) schematics encouraging experimentation in headset development, and continue all the way up the stack. At the device layer, the Khronos Group's OpenXR standard has been widely adopted by headset manufacturers, allowing applications and engines to target a single API, with device-specific capabilities supported through extensions.

This allows creators and developers of virtual worlds to focus on mechanics and content. While the techniques used to build 3D experiences aren't new, the increased interest in metaverse applications has resulted in new tools and engines for creating immersive experiences. Although libraries and engines differ in how they run their virtual worlds, most share the same underlying development concepts.

At the core of a virtual world is the 3D graphics and simulation engine (such as Babylon.js and the WebGL libraries it interacts with). This code is responsible for managing the world's game state, so that interactions that manipulate the state are shared between visitors of the space, and for drawing updates to the environment on screen. The game simulation state can include objects in the world and avatar movement, so that when one user moves through a space, everyone else sees it happening in real time. The rendering engine uses the perspective of a virtual camera to draw a 2D image on the screen, mapped to what the user is looking at in digital space.
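As a rough illustration of those two responsibilities, here is a minimal sketch using Babylon.js (TypeScript). The canvas id, the worldState object, and the broadcastState stub are hypothetical placeholders; a real platform would sync state over its own networking layer (WebSockets, WebRTC, and so on).

// Minimal Babylon.js sketch: a scene, a camera, and a render loop.
// worldState and broadcastState are illustrative stand-ins, not a real API.
import * as BABYLON from "babylonjs";

const canvas = document.getElementById("renderCanvas") as HTMLCanvasElement;
const engine = new BABYLON.Engine(canvas, true);   // wraps WebGL
const scene = new BABYLON.Scene(engine);           // holds the world's objects and state

// The virtual camera whose perspective is drawn onto the 2D canvas.
const camera = new BABYLON.FreeCamera("viewer", new BABYLON.Vector3(0, 1.6, -5), scene);
camera.attachControl(canvas, true);
new BABYLON.HemisphericLight("sun", new BABYLON.Vector3(0, 1, 0), scene);

// Hypothetical shared state: other visitors' avatar positions, object poses, etc.
const worldState = { avatars: new Map<string, BABYLON.Vector3>() };
function broadcastState(state: typeof worldState): void {
  // In a real platform, local changes would go out over WebSockets/WebRTC here.
}

engine.runRenderLoop(() => {
  broadcastState(worldState); // share local changes with other visitors
  scene.render();             // draw this frame from the camera's perspective
});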


The game world is made up of 2D and 3D objects that represent a virtual location. These experiences can vary, ranging from small rooms to entire planets, limited only by the creator's imagination. Inside the virtual world, objects have transforms that place them at a particular spot in the world's 3D coordinate system. The transform represents the object's position, rotation, and scale within the digital environment. These objects can have mesh geometry created in a 3D modeling program, along with materials and textures assigned to them, and they can trigger other events in the world, play sounds, or interact with the user.
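Continuing the hedged Babylon.js sketch from above, placing an object with a transform, a material, and a simple interaction trigger looks roughly like this; the mesh name, colors, and positions are made-up values for illustration.

// Sketch: an object with a transform (position, rotation, scale), a material,
// and a pick trigger. Values are illustrative only.
import * as BABYLON from "babylonjs";

function addCrate(scene: BABYLON.Scene): BABYLON.Mesh {
  const crate = BABYLON.MeshBuilder.CreateBox("crate", { size: 1 }, scene);

  // The transform: where the object sits in the world's 3D coordinate system.
  crate.position = new BABYLON.Vector3(2, 0.5, 3);          // x, y, z in world units
  crate.rotation = new BABYLON.Vector3(0, Math.PI / 4, 0);  // radians
  crate.scaling = new BABYLON.Vector3(1, 1, 1);

  // A material assigned to the mesh (a texture could be assigned the same way).
  const wood = new BABYLON.StandardMaterial("wood", scene);
  wood.diffuseColor = new BABYLON.Color3(0.6, 0.4, 0.2);
  crate.material = wood;

  // A trigger: picking (clicking) the object can fire other events or play sounds.
  crate.actionManager = new BABYLON.ActionManager(scene);
  crate.actionManager.registerAction(
    new BABYLON.ExecuteCodeAction(BABYLON.ActionManager.OnPickTrigger, () => {
      console.log("crate picked: play a sound or update the shared world state here");
    })
  );
  return crate;
}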

Once a virtual world has been created, the application renders content to the screen using a virtual camera. Like a camera in the real world, a camera inside a game engine has a viewport and settings that change the way a frame is captured. For immersive experiences, the camera draws many updates every second (up to 120 frames per second for some high-end virtual reality headsets) to reflect the way you're moving within the space. Virtual reality experiences specifically also require that the scene be drawn twice: once for each eye, offset slightly by your interpupillary distance (the distance between the centers of your pupils).
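Engines typically handle the per-eye camera work for you. As a hedged example, Babylon.js can hand its camera over to a WebXR session with a single helper call; the helper exists in Babylon.js, but the options and logging below are illustrative.

// Sketch: delegating VR camera work to WebXR in Babylon.js.
import * as BABYLON from "babylonjs";

async function enableVR(scene: BABYLON.Scene): Promise<void> {
  // On a headset, the scene is then rendered once per eye, offset by the
  // device-reported interpupillary distance, at the headset's refresh rate.
  const xr = await scene.createDefaultXRExperienceAsync({
    uiOptions: { sessionMode: "immersive-vr" },
  });
  console.log("XR camera ready:", xr.baseExperience.camera.name);
}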

If the camera and rendering components of developing a virtual world sound complex, don't fret. Most libraries and frameworks for authoring immersive content have these capabilities available so you can focus on the content and interactivity. Open source game engines, such as Open 3D Engine (O3DE) and Godot Engine, offer these rendering capabilities and many other tools as built-in features. With open source engines, developers have the additional flexibility of extending or changing core systems, which allows for more control over the end experience.

Other key characteristics that make up the metaverse include users taking on digital bodies (often referred to as avatars), user-generated content created and shared by users of the platform, voice and text chat, and the ability to navigate between differently themed worlds and rooms.

Approaches to building the metaverse

Before choosing a development environment for building the metaverse, you should consider which tenets are most critical for the types of experiences and worlds your users will inhabit. The first choice you're faced with is whether to target a native experience or the browser. Each has different considerations for how a virtual world unfolds.

A proprietary metaverse necessarily offers only limited connections between virtual worlds. In response, open source and browser-based platforms have emerged, building on top of web standards and working through the Khronos Group and the W3C to ensure interoperability and content portability.

Web applications such as Mozilla Hubs and Element's Third Room build on existing web protocols to create open source options for building browser-based virtual world applications. These experiences, which link together 3D spaces embedded in web pages, utilize open source technologies including three.js, Babylon.js, and A-Frame for content authoring. They also utilize open source real-time communication protocols for voice and synchronized avatar movement.

Open access

As with all emerging technologies, it's critical to consider the use case and the impact on the humans who use it. Immersive virtual and augmented reality devices have unprecedented capabilities to capture, process, store, and utilize data about an individual, including their physical movement patterns, cognitive state, and attention. Additionally, virtual worlds themselves significantly amplify both the benefits and the problems of today's social media, and they require careful implementation of trust and safety systems, moderation techniques, and appropriate access permissions to ensure that users have a positive experience when they venture into these spaces.

As the web evolves and encompasses immersive content and spatial computing devices, it's important to think critically and carefully about the experiences being created, and interoperability across different applications. Ensuring that these virtual worlds are open, accessible, and safe to all is paramount. The prospect of the metaverse is an exciting one, and one that can only be realized through collaborative open source software movements.


Share your Linux terminal with tmate

opensource.com - Tue, 06/14/2022 - 15:00
By Sumantro Mukherjee

As a member of the Fedora Linux QA team, I sometimes find myself executing a bunch of commands that I want to broadcast to other developers. If you've ever used a terminal multiplexer like tmux or GNU Screen, you might think that that's a relatively easy task. But not all of the people I want to see my demonstration are connecting to my terminal session from a laptop or desktop. Some might have casually opened it from their phone browser—which they can readily do because I use tmate.

Linux terminal sharing with tmate

Watching someone else work in a Linux terminal is very educational. You can learn new commands, new workflows, or new ways to debug and automate. But it can be difficult to capture what you're seeing so you can try it yourself later. You might resort to taking screenshots or a screen recording of a shared terminal session so you can type out each command later. The only other option is for the person demonstrating the commands to record the session using a tool like Asciinema or script and scriptreplay.

But with tmate, a user can share a terminal either in read-only mode or over SSH. Both the SSH and the read-only session can be accessed through a terminal or as an HTML webpage.

I use read-only mode when I'm onboarding people for the Fedora QA team because I need to run commands and show the output; with tmate, folks can keep notes by copying and pasting from their browser into a text editor.

Linux tmate in action

On Linux, you can install tmate with your package manager. For instance, on Fedora:

$ sudo dnf install tmate

On Debian and similar distributions:

$ sudo apt install tmate

On macOS, you can install it using Homebrew or MacPorts. If you need instructions for other Linux distributions, refer to the install guide.


Once installed, start tmate:

$ tmate

When tmate launches, links are generated to provide access to your terminal session over HTTP and SSH. Each protocol features a read-only option as well as a reverse SSH session.

Here's what a web session looks like:

[Screenshot: a tmate web session (Sumantro Mukherjee, CC BY-SA 4.0)]

Tmate's web console is HTML5, so a user can copy the entire screen and paste it into a terminal to run the same commands.

Keeping a session alive

You may wonder what happens if you accidentally close your terminal. You may also wonder about sharing your terminal with a different console application. After all, tmate is a multiplexer, so it should be able to keep sessions alive, detach and re-attach to a session, and so on.

And of course, that's exactly what tmate can do. If you've ever used tmux, this is probably pretty familiar.

$ tmate -F -n web new-session vi console

This command opens a new session running Vi, and the -F option ensures that the session re-spawns even when it's closed.


Social multiplexing

Tmate gives you the freedom of tmux or GNU Screen plus the ability to share your sessions with others. It's a valuable tool for teaching other users how to use a terminal, demonstrating the function of a new command, or debugging unexpected behavior. It's open source, so give it a try!


How to Install AlmaLinux 9 Step by Step

Tecmint - Tue, 06/14/2022 - 14:36
AlmaLinux is a free and open-source community-driven operating system developed as a perfect alternative to CentOS 8, which Red Hat discontinued in favor of CentOS Stream. It is 1:1 binary compatible with RHEL and is...

The post How to Install AlmaLinux 9 Step by Step first appeared on Tecmint: Linux Howtos, Tutorials & Guides.

AMD Linux CPU Temperature Driver Sees Latest Patches For Zen 4 & Likely Mendocino

Phoronix - Tue, 06/14/2022 - 07:13
One of my personal gripes with AMD's Zen CPU support on Linux has been the lack of timely support for CPU temperature monitoring with their "k10temp" driver. Even though often just new IDs are needed, along with the occasional offset adjustment or other minor change, that support has traditionally landed post-launch and has sometimes been left up to patches from the open-source community. Thankfully that has been changing, and with Zen 4 it looks like the support will be ready for launch day with the mainline Linux kernel...

Proposed SIG Could Help Fedora Linux Become A Leader For Heterogeneous Computing

Phoronix - Tue, 06/14/2022 - 02:00
To organize efforts around improving Fedora Linux for heterogeneous computing, a new special interest group (SIG) is being proposed to help ensure the success of Fedora in the world of XPUs, the growing and very diverse software ecosystem around accelerators, and more...

LUMI Inaugurated As Europe's Most Powerful Supercomputer - Powered By AMD CPUs/GPUs

Phoronix - Tue, 06/14/2022 - 01:22
While not record-shattering like the 1.1 exaflops Frontier supercomputer at ORNL that took the Top500's top spot from Fugaku this year, LUMI was inaugurated today with the claim of being Europe's most powerful supercomputer...
