Open-source News

Radeon Software For Linux 22.10 Driver Being Prepared For Release

Phoronix - Thu, 03/31/2022 - 17:50
At the moment the Radeon Software for Linux 21.50.2 is AMD's latest packaged graphics driver intended for enterprise Linux distributions. But Radeon Software for Linux 22.10 should soon be announced and can already be fetched from their package archive...

Apple M1 NVMe Linux Driver Out For Review

Phoronix - Thu, 03/31/2022 - 17:28
Sent out last week amid the busy Linux 5.18 merge window days were the patch series wiring up an Apple NVMe driver for use with the M1, M1 Pro, and M1 Max SoCs...

Fedora Looks To Better Onboarding For IoT/Edge Devices

Phoronix - Thu, 03/31/2022 - 17:08
For Fedora 37 later this year the Linux distribution is looking at providing support for zero touch onboarding for IoT / edge devices...

How I customize my Linux window decorations

opensource.com - Thu, 03/31/2022 - 15:00

One thing I especially like about Linux is the amazing and vast array of choices in almost everything. Don't like one application for something? There are usually several more you can choose from. Don't like how the desktop works? Pick one of many other desktops. Don't like the window decorations on your desktop? There are many others you can download and try.

What if you don't like one little thing about your choice of window decorations—and all other sets of decorations are not even close?

One of the advantages of open source is that I can change anything I want. So I did.

I use the Alienware-Bluish theme on my Xfce desktop. I like its futuristic look, the cyan and gray colors that match my dark primary color schemes—and sometimes my moods. It has a nice 3D relief in the corners, and the corners and edges are wide enough to grab easily, even with my Hi-DPI resolution. Figure 1 shows the original Alienware-Bluish decorations with the gradient-black-324.0 color scheme I prefer.

Figure 1. An active window (with the focus) using the original Alienware-Bluish decorations (David Both, CC BY-SA 4.0)

Two things bother me about this window. First, the title text in the title bar of active windows is just too dull for me. The inactive windows have a bright white title that attracts my eye more than the dull cyan of the active title.

Second, I like dark wallpapers, as you can see in Figure 1. Because the bottom edge of the window does not have a cyan highlight, it can be difficult to tell where the bottoms of the windows are, especially when there are a lot of overlapping windows open.

Pretty minor annoyances, I know, but they just bothered me. And that is one of the coolest things about open source: I can modify anything I want, even for trivial reasons. It just takes a bit of knowledge, which I am sharing with you.

Where are the decoration files?

The first thing I needed to do was locate the files for the decorations I use, Alienware-Bluish. I know where to look because of the many decorations I have downloaded over the years.

All of the decoration themes I download are located in the /usr/share/themes/ directory so that all users have access to them. Each theme gets its own subdirectory, so the Alienware-Bluish theme is located in the /usr/share/themes/Alienware-Bluish/xfwm4/ directory. xfwm4 stands for Xfce window manager, version 4.

If you install your themes in your home directory, they will be located in the ~/.local/share/themes/Alienware-Bluish/xfwm4 directory. Themes stored in your home directory are not available to other users on your computer.
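
If you are not sure which themes are installed or where a theme's window manager files live, a quick listing from a terminal shows you. These paths are the ones used in this article; substitute your own theme name as needed:

ls /usr/share/themes/
ls /usr/share/themes/Alienware-Bluish/xfwm4/
ls ~/.local/share/themes/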

Preparation

I don't like to work on the original files for anything important like a theme, so I copied the /usr/share/themes/Alienware-Bluish directory and its contents to a new directory, /usr/share/themes/Alienware-Bluish-2, and changed the ownership of the copied files to my own non-root account. This gives me a safe place to work without inadvertently damaging the original beyond repair, and it means I can edit the copies without becoming root.

Besides, I want to keep the original so I can continue to use it.
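
A minimal sketch of that preparation step, assuming you have sudo available (your exact commands may differ): the copy preserves the theme files, and the chown hands the new directory to your own account so you can edit it without root.

sudo cp -a /usr/share/themes/Alienware-Bluish /usr/share/themes/Alienware-Bluish-2
sudo chown -R "$USER": /usr/share/themes/Alienware-Bluish-2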

Getting started

View the files in the /usr/share/themes/Alienware-Bluish-2/xfwm4 directory using Thunar or another file manager that can display image thumbnails, then zoom in so you can see the images better. Each *.xpm (X11 Pixmap) file is an image of a small window frame section, as you can see in Figure 2.

Figure 2: The files that make up the various segments of a window (David Both, CC BY-SA 4.0)

Notice that each component has an active and an inactive version. In the case of this theme, they are mostly the same. Because the copied files belong to my account, I can edit them freely.

Look especially at the bottom-active.xpm and bottom-inactive.xpm files. These are the two files that define the look of the bottom of the window. These two images are only one pixel wide, so they are essentially invisible in Figure 2. The window manager uses as many instances as necessary to create the bottom edge of the window.
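
For a sense of what these files contain: an XPM file is plain text that describes the image as a small character array. A one-pixel-wide, six-pixel-high strip such as bottom-active.xpm looks roughly like this (the color values shown are illustrative, not the theme's actual pixels):

/* XPM */
static char *bottom_active[] = {
"1 6 3 1",
". c #000000",
"a c #1a1a1a",
"b c #333333",
".",
"a",
"b",
"b",
"a",
"."
};

Each single-character row is one pixel, which is why the window manager can tile the image horizontally to any width.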

Themes for other desktops may use different file formats.

Making the changes

First, I changed the title color. The themerc file contains text configuration data that defines several aspects of the title bar. This file is an ASCII text file. Here is the content for the theme:

full_width_title=true
title_alignment=center
button_spacing=2
button_offset=30
button_layout=S|HMC
active_text_color=#699eb4
inactive_text_color=#ffffff
title_vertical_offset_active=5
title_vertical_offset_inactive=5

The hex numbers in the text color entries define the colors for active and inactive title text. To change the active title text, I need to determine what value to use in this field. Fortunately, there is a tool that can help. The KcolorChooser can be used to select a color from the color palette, or the Pick Screen Color button can be used to choose a color already displayed on the screen.

I used this color picker to locate the cyan highlight in the side of the window, but I found it just a little too bright for the bottom. I wanted it a bit less bright, so I used the tools on the KcolorChooser to adjust the color and intensity to my preference. You can see the result in Figure 3.

Figure 3. Using the KcolorChooser to select a specific color (David Both, CC BY-SA 4.0)

The KcolorChooser can be installed if you don't have it already. On Fedora and other Red Hat-based distros, you can use the following command:

dnf -y install kcolorchooser

If you don't already have the KDE desktop or any of its tools installed, this command will install a large number of KDE libraries and other dependencies. It was already installed on my workstation because I have the KDE Plasma desktop installed.

After deciding which color I wanted, I obtained the hex digits for that color from the HTML text box. I then typed those into the themerc file so the active_text_color line looks like this:

active_text_color=#00f1f1

The next part, changing the bottom-active.xpm image file, is a little more complicated. I used GIMP to modify the bottom-active.xpm file, but you can use any graphics editor you are comfortable with. One catch: the image is so small that it must be enlarged enormously to be workable. I found that 8,000% worked well on my display. You can see this in Figure 4. The image is six pixels high by one pixel wide, in black and shades of dark gray.

Figure 4. The bottom-active.xpm file shown at 8,000% magnification in GIMP (David Both, CC BY-SA 4.0)

I used the KcolorChooser to find a shade of cyan a little darker than the one on the side and top edges of the window. After playing around with it, I settled on the shade #10b0ae, which I then copied into the text field of GIMP's Colors dialog. I had to add this dialog to the dock area at the upper right of the GIMP window by selecting Menu Bar Tools > Dockable Dialogs > Colors. Alternatively, I could have used the color picker (the eye-dropper icon) in the GIMP Colors dialog to pick the color directly from the sample area of the KcolorChooser.

At any rate, I now had the color I liked in the GIMP color dialog. I used the Rectangle Select tool to select the 3 pixels highlighted in Figure 5 and the Bucket Fill tool to fill the selected area with the new color. Figure 5 shows the final color.

Figure 5. The modified bottom-active.xpm file with the addition of cyan (David Both, CC BY-SA 4.0)
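
If you prefer to stay on the command line, ImageMagick can make the same kind of single-pixel edit. This is only a sketch: the rows to fill (here the bottom three of the six-pixel strip) are an assumption based on Figure 5, so adjust the coordinates to match the pixels you actually want to change, and work on your copy of the file:

convert bottom-active.xpm -fill '#10b0ae' -draw 'rectangle 0,3 0,5' bottom-active.xpm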

Exporting the revised file

GIMP converted the .xpm file into a data format it could use, but it can't save the data directly into a .xpm file. Instead, I used the export function to save the file. This was not a big deal, but a bit unexpected the first time.

During the export, I was presented with a dialog asking for an Alpha Threshold value. I don't know enough about GIMP or manipulating graphics files to know what that is, so I left it alone and clicked on the Export button.

Testing

The changes I made to this theme are easy to test. I simply used the Window Manager to select the Alienware-Bluish-2 theme. This loads the new theme instantly, so I can see the results right away.

Had I not liked the results, I could have made additional changes and tested again. At this point, however, I would have had to change back to the original Alienware-Bluish theme (or any other theme) and then back to the Alienware-Bluish-2 theme to verify the change. The revised files are not loaded until the theme is re-read.
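
If you want to script that reload cycle, Xfce's xfconf-query tool can switch the window manager theme from a terminal; toggling between the original and the copy forces the revised files to be re-read:

xfconf-query -c xfwm4 -p /general/theme -s Alienware-Bluish
xfconf-query -c xfwm4 -p /general/theme -s Alienware-Bluish-2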

Figure 6 shows the revised theme using the cyan highlights in the bottom window edge. I think it looks much better.

Figure 6. A window showing the altered bottom edge (David Both, CC BY-SA 4.0)

Final thoughts

I had no idea how to fix minor problems and annoyances with window decorations until I started this little project. It did take some time and research to figure out how to do this. I learned there is an xpm graphics format, and I learned a little more about working in GIMP, including how to export into that file format. I also discovered this was a fairly easy change to make.

I still don't feel I have the skill or creative vision for graphics to design a completely new window decoration theme. But now I can easily make minor changes to themes someone else has created.

This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.

Installation and Review of Bodhi Linux [Lightweight Distro]

Tecmint - Thu, 03/31/2022 - 12:00

Bodhi Linux is a new Linux distribution, based on Ubuntu Linux – which in turn is based on Debian – that aims to be lightweight and free. As an Ubuntu-based system, we have the

The post Installation and Review of Bodhi Linux [Lightweight Distro] first appeared on Tecmint: Linux Howtos, Tutorials & Guides.

Secure software supply chains: good practices, at scale

The Linux Foundation - Thu, 03/31/2022 - 03:37

Here at The Linux Foundation’s blog, we share content from our projects, such as this article from the Cloud Native Computing Foundation’s blog. The guest post was originally published on Contino Engineering’s blog by Dan Chernoff. 

Supply chain attacks rose by 42% in the first quarter of 2021 [1] and are becoming even more prevalent [2]. In response to software supply chain breaches like SolarWinds [3], Kaseya [4], and other less publicized compromises [5], the Biden administration issued an executive order that includes guidance designed to improve the federal government's defense against cyber threats [6]. With all of this comes the inevitable slew of blog posts that detail a software supply chain and how you would protect it. The Cloud Native Computing Foundation recently released a white paper on software supply chain security [7], an excellent summary of the current best practices for securing your software supply chain.

The genesis of the content in this article is work done to implement secure supply chain patterns and practices for a Contino customer. The core goals for the effort were to implement a pipeline-agnostic solution that ensures the security of the pipelines and enables secure delivery for the enterprise. We'll talk a little about why we chose the tools we did in each section and how they supported the end goal.

As we start our journey, we’ll first touch on what a secure software supply chain is and why you should have one to set the context for the rest of the blog post. But let’s assume that you have already decided that your software supply chains need to be secure, and you want to implement the capability for your enterprise. So let’s get into it!

Anteing Up

Before you embark upon the quest of establishing provenance for your software at scale, there are some table stakes elements that teams should already have in place. We won’t delve deeply into any of them here other than to list and briefly describe them.

Centralized Source Control. Git is by far the most popular choice. This ensures a single source of truth for development teams. Beyond just having source control, teams should also sign their Git commits (an example follows this list).

Static Code Analysis. This identifies possible vulnerabilities within ‘static’ (non-running) source code by using techniques such as Taint Analysis and Data Flow Analysis. Analysis and results need to be incorporated into the cadence of development.

Vulnerability Scanning. Implement automated tools that scan the applications and containers that are built to identify potential vulnerabilities in the compiled and sometimes running applications.

Linting. A linter analyzes source code to flag programming errors, bugs, and stylistic problems. Linting is important to reduce errors and improve overall code quality, which in turn accelerates development.

CI/CD Pipelines. New code changes are automatically built, tested, versioned, and delivered to an artifact repository. A pipeline then automatically deploys the updated applications into your environments (e.g. test, staging, production, etc.).

Artifact Repositories. Provide management of the artifacts built by your CI/CD systems. An artifact repository can help with the version and access control of your artifacts.

Infrastructure as Code (IaC) is the process of managing and provisioning infrastructure (e.g. virtual machines, databases, load balancers, etc.) through code. As with applications, IaC provides a single source of truth for what the infrastructure should look like. It also provides the ability to test before deploying to production.

Automated…well, everything. Human-in-the-loop systems are not deterministic. They are prone to error which can and will cause outages and security gaps. Manual systems also inhibit the ability of platforms to scale quickly.
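
As an example of the signed-commit practice mentioned in the first item, Git only needs a couple of settings once a GPG key exists; the key ID below is a placeholder:

git config --global user.signingkey 3AA5C34371567BD2
git config --global commit.gpgsign true
git commit -S -m "Wire metadata capture into the build stage"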

What is a Secure Software Supply Chain

A software supply chain consists of everything that goes into the creation of your end software product and the mechanisms you use to deliver that product to customers. This includes your source code, your build systems, third-party libraries, deployment infrastructure, and delivery repositories.

Attributes:

  • Establishes Provenance — One part of establishing provenance is ensuring that any artifact that is created and accessed by the customer should be able to trace its lineage all the way back to the developer(s) that merged the latest commit. The other part is the ability to demonstrate (or attest) that for each step in the process, the software, components, and other materials that go into creating the final product are tamper-free.
  • Trust — Downstream systems and users need a mechanism to verify that the software that is being installed or deployed came from your systems and that the version being used is the correct version. This ensures that malicious artifacts have not been substituted or that older, vulnerable versions have not been relabeled as the current version.
  • Transparency — It should be easy to see the results and details for all steps that go into the creation of the final artifact. This can include things like test results, output from vulnerability scans, etc.

Key Elements of a Secure Software Supply Chain

Let’s take a closer look at the things that need to be layered into your pipelines to establish provenance, enable transparency, and ensure tamper resistance.

Here is what a typical pipeline that creates a containerized application might look like. We'll use this simple pipeline and add elements as we discuss them.



Establishing Provenance Using in-toto

The first step in our journey is to establish that artifacts built via a pipeline have not been tampered with and to do so in a reliable and repeatable way. As we mentioned earlier, part of this is creating evidence to use as part of the verification. in-toto is an open-source tool that creates a snapshot of the workspace where the pipeline step is running.

These snapshots (“link files” in the in-toto terminology) verify the integrity of the pipeline. The core idea behind in-toto is the concept of materials and products and how they flow, just like in a factory. Each step in the process usually takes some material and produces a product. The build step, for example, uses code as its material, and the built artifact (jar, war, etc.) is its product. A later step in the pipeline then uses that built artifact as its material and produces another product. In this way, in-toto chains the materials and products together and can identify whether a material was tampered with during or between pipeline steps, for example if the artifact constructed during the build step changed before testing.
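
As a concrete sketch, a build step can be wrapped with in-toto's command-line runner, which records the materials and products and writes a signed link file for that step. The step name, key, paths, and build command here are illustrative, not taken from the Contino pipeline:

in-toto-run --step-name build --key builder_key --materials src/ --products target/app.war -- mvn -q package

The resulting link file for the step is what later gets checked against the layout.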



At the end of the pipeline, in-toto evaluates the link data (the attestation created at each step) against an in-toto layout (think of it as a Jenkinsfile for attestation) and verifies that all the steps were done correctly and by the approved people or systems. This verification can run anytime the product of the pipeline (container, war, etc.) needs to be verified.
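
The final verification is itself a single command run wherever the product needs to be checked. The file names below are placeholders, and the flag names follow the in-toto CLI as documented; check in-toto-verify --help for the version you have installed:

in-toto-verify --layout root.layout --layout-key owner.pub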

Critical takeaways for establishing provenance

in-toto runs at every step of the process. During verification, the attestation from each step is compared against an overarching layout. This process gives consumers (users and/or deployment systems) confidence that the artifacts built were not altered from start to finish.

Establishing Trust using TUF

You can use in-toto verification to know that the artifact was delivered or downloaded without modification. To do that, you will need to download the artifact(s), the in-toto link files used during the build, the in-toto layout, and the public keys to verify it all. That is a lot of work. An easier way is to sign the artifacts produced with a system that enables centralized trust. The most mature framework for doing so is TUF (The Update Framework).

TUF is a framework that gives consumers of artifacts guarantees that the artifact downloaded or automatically installed came from your systems and is the correct version. The guts of how to accomplish this are outside the scope of this blog post. The functionality we are interested in is verifying that an artifact came from the producer we expected and that the version is the expected version.

Implementing TUF on your own is a fair bit of work. Fortunately, an “out of the box” implementation of TUF is available for use, Docker Content Trust (a.k.a. Notary). Notary enables the signing of regular files as well as containers. In our example pipeline, we sign the container image during build time. This signing allows any downstream system or user to verify the authenticity of the container.
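
A minimal sketch of what that looks like with Docker Content Trust: with the environment variable set, docker push signs the tag as part of the push (the first signed push prompts you to create the repository's signing keys), and a client pulling with the same variable set will refuse unsigned images. The registry and tag are placeholders:

export DOCKER_CONTENT_TRUST=1
docker push registry.example.com/team/app:1.4.2
docker pull registry.example.com/team/app:1.4.2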



Transparency: Centralized Data Storage

One gap in in-toto as a solution is that it provides no mechanism to persist the link data it creates. It is up to the team implementing in-toto to capture and store the link data somewhere, so that the valuable metadata from each step is kept outside of the build system. The goal is twofold. The first is to store the link data outside the pipeline so teams can retrieve it and use it anytime verification needs to run on the artifacts the pipeline produced. The second is to store the metadata about the build process outside the pipeline, which enables teams to implement visualizations, monitoring, metrics, and rules on the data produced by the pipeline without necessarily needing to keep it in the pipeline.

The Contino team created metadata capture tooling that is independent and agnostic of the pipeline. We chose to write a simple Python tool that captures the metadata and in-toto link data and stores it in a database. If your CI/CD platform is reasonably standard, you can likely use built-in mechanisms to achieve the same results. For example, the Jenkins Logstash plugin can capture the output of a build step and persist the data to an Elasticsearch data store.
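
The Contino tooling itself is not published with this post, but the idea can be illustrated with something as small as copying each step's link files out of the workspace to external storage at the end of the stage. The bucket name and CI variable below are placeholders, and an object store is only one possible target:

for link in *.link; do
  aws s3 cp "$link" "s3://supply-chain-evidence/${CI_PIPELINE_ID}/"
done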



PGP and Signing Keys

A core component of both in-toto and Notary is the keys used to sign and verify link data and artifacts/containers. in-toto uses PGP private keys to sign the link data produced at each step. That signing establishes a relationship between the person or system that performed the action and the link data, and it ensures that any alteration or tampering of the link data can be easily detected.

Notary uses public and private keys generated with the Docker or Notary CLI. The public keys are stored in the Notary database, and the private keys sign the containers or other artifacts.
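
For illustration, the key material for each tool can be generated with the tools' own CLIs; the key names and email address are placeholders:

gpg --quick-generate-key "build-pipeline@example.com"   # PGP key for signing in-toto link data
docker trust key generate ci-signer                     # delegation key for Notary / Docker Content Trust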

Scaling Up

For a small set of pipelines, manually implementing and managing secure software supply chain practices is straightforward. Managing an enterprise with hundreds, if not thousands, of pipelines requires additional automation.

Automate in-toto layout creation. As mentioned earlier, in-toto has a file akin to a Jenkins file that dictates what person or systems can complete a pipeline step, the material and product flow, and how to inspect/verify the final artifact(s). Embedded in this layout are the IDs for the PGP keys of the people or systems who can perform steps. Additionally, the layout is internally signed to ensure that any tampering can be detected once the layout gets created. To manage this at scale, the layouts need to be automatically created/re-created on demand. We approach this as a pipeline that automatically runs on changes to the code that creates layouts. The output of the pipeline is layouts, which are treated as artifacts themselves.

Treat in-toto layouts like artifacts. in-toto layouts are artifacts, just like containers, jars, etc. Layouts should be versioned, and the layout version linked to the version of the artifact it governs. This versioning enables artifacts to be re-verified later using the layout, link files, and keys that were in place when the artifact was created.

Automate the creation of the signing keys. Signing keys used by autonomous systems should be rotated frequently and through automation. Doing this limits the likelihood of compromise of the signing keys used by in-toto and Notary. For in-toto, frequent rotation requires the automatic re-creation of the in-toto layouts. For Notary, cycling the signing keys requires revoking the old key when the new key is put in place.

Store and use signing keys from a secret store. When generating signing keys for use by automated systems, storing the keys in a secret management system like HashiCorp's Vault is an important practice. The automated system (e.g., Jenkins, GitLab CI) can then retrieve the signing keys when needed. Centrally storing the signing keys combats "secrets sprawl" in an enterprise and enables easier management (an example follows this list).

Pipelines should be roughly similar. A single in-toto layout can be used by many pipelines, as long as they operate in the same way. For example, pipelines that build a Java application and produce a WAR as the artifact probably operate in roughly the same way. Such pipelines can all use the same layout if they are similar enough.
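
With HashiCorp Vault's KV store, for example, the secret-store item above boils down to writing the key once and letting the pipeline read it back at run time; the secret path, field name, and key file are placeholders:

vault kv put secret/ci/signing in_toto_key=@builder_key
vault kv get -field=in_toto_key secret/ci/signing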

Wrapping it All Up

Using the technologies, patterns, and practices described here, the Contino team was able to deliver an MVP-grade solution for the enterprise. The design can scale up to thousands of application pipelines and helps ensure software supply chain security for the enterprise.

At its core, a secure software supply chain encompasses anything that goes into building and delivering an application to the end customer. It is built on the foundations of secure software development practices (e.g. following OWASP top 10, SAST, etc.). Any implementation of secure supply chain best practices needs to establish provenance about all aspects of the build process, provide transparency for all steps and create mechanisms that ensure trustworthy delivery.

Sources:

[1] https://www.propertycasualty360.com/2021/04/13/supply-chain-attacks-rose-42-in-q1/?slreturn=20210726153708

[2] https://portswigger.net/daily-swig/four-fold-increase-in-software-supply-chain-attacks-predicted-in-2021-report

[3] https://www.npr.org/2021/04/16/985439655/a-worst-nightmare-cyberattack-the-untold-story-of-the-solarwinds-hack

[4] https://www.zdnet.com/article/updated-kaseya-ransomware-attack-faq-what-we-know-now/

[5] https://portswigger.net/daily-swig/researcher-hacks-apple-microsoft-and-other-major-tech-companies-in-novel-supply-chain-attack

[6] https://www.whitehouse.gov/briefing-room/presidential-actions/2021/05/12/executive-order-on-improving-the-nations-cybersecurity/

[7] https://github.com/cncf/tag-security/raw/main/supply-chain-security/supply-chain-security-paper/CNCF_SSCP_v1.pdf

The post Secure software supply chains: good practices, at scale appeared first on Linux Foundation.

Glibc's strncasecmp / strcasecmp Get AVX2 & EVEX Optimized Versions, Drops AVX

Phoronix - Thu, 03/31/2022 - 03:20
The GNU C Library (glibc) has landed a set of 23 patches providing optimized AVX2 and EVEX versions of strcasecmp/strncasecmp functions while dropping support for the original AVX implementation...

EXT4's Fast Commit Feature Faster & More Scalable With Linux 5.18

Phoronix - Thu, 03/31/2022 - 01:39
Last week the EXT4 file-system feature updates were submitted and merged for the ongoing Linux 5.18 merge window...

What's KernelCare? - Linux Journal

Google News - Thu, 03/31/2022 - 01:35
What's KernelCare?  Linux Journal

Intel Introduces The Arc A-Series Mobile Graphics

Phoronix - Wed, 03/30/2022 - 23:00
Intel today is formally introducing its Arc 3 series mobile graphics, which will begin appearing in laptops in April, while the Arc 5 and Arc 7 graphics are coming in the "early summer" for the much anticipated Intel discrete graphics offerings.
