Open-source News

NFTs should be green, too

The Linux Foundation - Mon, 04/25/2022 - 21:00

Non-Fungible Tokens (NFTs) are an invention unique in human history whose role is fast extending beyond the speculative trends around collectibles to use cases that have a positive social impact. 

Through NFTs, a broad range of physical and virtual assets can be authenticated, providing transparency on ownership and underlying attributes of tokenized assets while preserving the privacy of individual owners. The cryptographic guarantees of NFTs make them well suited for use cases such as anti-counterfeiting, provenance tracking, and title transfer.

However, minting an NFT on Proof of Work (PoW) blockchains requires a high level of computational power, and the energy needed to achieve it is primarily supplied by non-renewable fuel sources. As a result, the emissions from minting, transferring, and burning NFTs can be quite high.

It’s estimated that the mining activities associated with cryptocurrencies emit as much as 114.06 megatons of CO2 per year, roughly the annual emissions of the entire Czech Republic.

Most of this effect is caused by electricity usage, as blockchain networks are frequently energy-intensive due to their PoW consensus mechanisms. Based on current patterns, blockchain technology is projected to account for 1% of global electricity consumption by 2025. However, not all digital assets are energy-intensive.

In a new study, Linux Foundation Research and Hyperledger Foundation collaborated with Palm NFT Studio to examine the design architecture of NFTs and how they may have varying carbon footprints depending on their underlying technology stacks. In essence, not all blockchains are equally harmful to the environment.

Download Report

The report also provides recommendations for how NFT creators can reduce the environmental impact of their work, such as by using an alternative consensus mechanism that is not carbon-intensive. Those mechanisms need to be robust enough to:

  • Reduce blockchain’s carbon footprint
  • Protect against coordinated blockchain attacks by increasingly consolidated mining computing power
  • Overcome blockchain scaling challenges, which are limited by both slow finality times and low volumes of transactions per second (on Ethereum and many other blockchains)

One such alternative in use today is the Proof of Stake (PoS) consensus mechanism, which is far less computationally intensive than PoW. Rather than racing to solve computational puzzles, those in charge of the blockchain’s upkeep in a PoS system stake (i.e., “pledge”) their currency, putting it in a type of escrow as a guarantee against fraud. If everything goes well, those who stake their tokens may earn a small profit through a share in block rewards.
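The staking model described above can be sketched in a few lines of code. The following is a toy, stake-weighted selection loop; it is a simplified illustration only, the validator names, stake values, and `pick_validator` helper are all invented, and no real chain works exactly this way:

```python
import random

# Hypothetical validator set: validator name -> tokens staked ("pledged") as collateral.
STAKES = {"alice": 50, "bob": 30, "carol": 20}

def pick_validator(stakes, rng=random):
    """Select the next block proposer with probability proportional to stake."""
    total = sum(stakes.values())
    point = rng.uniform(0, total)  # one cheap random draw replaces trillions of hashes
    cumulative = 0
    for validator, stake in stakes.items():
        cumulative += stake
        if point <= cumulative:
            return validator
    return validator  # guard against floating-point rounding at the top end

# Honest proposers earn a share of block rewards; provable fraud would instead
# forfeit ("slash") part of the offender's escrowed stake.
print(pick_validator(STAKES))
```

Because selection probability scales with escrowed stake rather than hashing power, the energy cost of proposing a block is negligible compared with PoW mining.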

While we believe that a move to more environmentally friendly NFTs through alternative consensus mechanisms is an essential first step, it is not the only one needed to make the industry more sustainable. Sustainable practices for NFTs (and for the blockchain industry as a whole) start with reduction. Using renewable energy sources, such as solar and wind, can further reduce blockchain emissions. 

Beyond choosing sustainable blockchain architecture for issuing NFTs, carbon offsets are an important add-on to the sustainability equation. Offset projects can include a wide range of activities, from planting new forests to capturing methane gas from landfills. 

Measured, verified, and certified offsets allow a price to be placed on more carbon-intensive activities, giving companies and businesses a way to incorporate these costs into their budgets. While embracing offset projects can invite greenwashing claims, choosing certified initiatives in tandem with other efforts mitigates that risk.

NFTs are here to stay, so now is the time for the industry to reduce its carbon footprint and become more sustainable by leveraging existing technologies and carbon offset opportunities. We hope this report serves as a starting point to inform such decisions.

Subscribe to LF Research

The post NFTs should be green, too appeared first on Linux Foundation.

AMD Ryzen 7 5800X3D On Linux: Not For Gaming, But Very Exciting For Other Workloads

Phoronix - Mon, 04/25/2022 - 19:28
Last week AMD began shipping the much anticipated Ryzen 7 5800X3D, their first 3D V-Cache consumer CPU, with the claim of it being "the world's fastest PC gaming processor" by outperforming even the Core i9 12900K / 12900KS for Windows gaming. We weren't seeded by AMD for this launch, leading us to anticipate that it's not too good for Linux gaming / not their target market. But after the great success I've had with AMD Milan-X performance on Linux, I was very eager to try out this consumer CPU with the 3D-stacked L3 cache and ended up purchasing a 5800X3D. Indeed, the Ryzen 7 5800X3D turned out to be disappointing for Linux gaming performance, but it proved very interesting for a range of other technical workloads and makes me very excited for future Ryzen CPUs with 3D V-Cache.

Zstd Compressed Firmware Will Finally Be Supported With Linux 5.19

Phoronix - Mon, 04/25/2022 - 18:23
For two years there has been interest in, and unmerged patches for, allowing Linux's plethora of firmware blobs to be Zstd-compressed to help save disk space. Finally, it looks like optional Zstd firmware compression support will be merged for Linux 5.19...

Linux 5.19 To Upstream Driver For Raspberry Pi's Sense HAT Joystick

Phoronix - Mon, 04/25/2022 - 17:56
Queued up into the input subsystem's for-next branch ahead of Linux 5.19 is a new driver for supporting the Raspberry Pi Sense HAT Joystick...

NVIDIA Working On VFIO Power Management Improvements For Linux

Phoronix - Mon, 04/25/2022 - 17:42
A NVIDIA engineer is working on addressing the currently "very limited" power management support available with the Linux kernel's upstream VFIO PCI driver...

Wolfire Games Releases Overgrowth Game As Open-Source

Phoronix - Mon, 04/25/2022 - 17:14
Open-source friendly game studio Wolfire Games has released their Overgrowth title, which debuted back in 2017 as the sequel to the Lugaru game, as open-source software...

New open source tool catalogs African language resources

opensource.com - Mon, 04/25/2022 - 15:00
Chris Emezue

The last few months have been full of activity at Lanfrica, and we are happy to announce that Lanfrica has been officially launched.

What is Lanfrica?

Lanfrica aims to mitigate the difficulty encountered when seeking African language resources by creating a centralized, language-first catalog.

For instance, if you're looking for resources such as linguistic datasets or research papers in a particular African language, Lanfrica will point you to sources on the web with resources in the desired language. If those resources do not exist, we adopt a participatory approach by allowing you to contribute papers or datasets.

Image by: Chris Emezue, CC BY-SA 4.0

At Lanfrica, we employ a language-focused approach. With 2,199 African languages accounted for, our language section boasts all the African languages—yes, all of them, including the extinct ones! We have created algorithms that can reliably identify the African language(s) involved in a resource, enabling us to curate even works that do not explicitly specify the African languages they cover (and there are many).

Lanfrica offers enormous potential for better discoverability and representation of African languages on the web. Lanfrica can provide useful statistics on the progress of African languages. As a simple illustration, the language filter section offers an immediate overview of the number of existing natural language processing (NLP) resources for each African language.

Image by: Chris Emezue, CC BY-SA 4.0

From this search result, you can easily see that among South African languages, Afrikaans has 28 NLP resources, while Swati has just eight. Or, to take another example, the Gbe cluster languages of Benin have far fewer NLP resources than some of the South African languages.

Image by: Chris Emezue, CC BY-SA 4.0

Such insight can lead to better allocation of funds and efforts towards bringing the more under-researched languages forward in NLP, thereby fostering the equal progress of African languages.

Lanfrica v1 is just the beginning. We have major updates coming:

  • We plan to enable our users to sign up and add to or edit the resources on Lanfrica.

  • Our resources currently consist of NLP datasets. Next, we plan to work on publications in computational linguistics and other linguistic publications. See the infographic above for all the types of resources planned for inclusion.

  • We are exploring various techniques to simplify the process through which relevant resources are identified and connected to Lanfrica.

For more updates as we move forward, become part of the Lanfrica community by joining our Slack or following us on Twitter.

This article originally appeared on the Lanfrica blog and is republished with permission.

Image by: Geralt, CC0

This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.

Prevent Kubernetes misconfigurations during development with this open source tool

opensource.com - Mon, 04/25/2022 - 15:00
Noaa Barki

I'm a developer by nature, but I've been doing a lot of DevOps work lately, especially with Kubernetes. As part of my work, I've helped develop a tool called datree with the aim of preventing Kubernetes misconfiguration from reaching production. Ideally, it helps empower collaboration and fosters a DevOps culture in your organization for the benefit of people like me, who don't always think in DevOps.

A common scenario

The following scenario demonstrates a problem faced by many tech companies:

  • At 3:46AM on a Friday, Bob wakes up to the sound of something falling onto his bedroom floor. It's his phone, showing 15 missed calls from work.
  • Apparently, Bob had forgotten to add a memory limit in a deployment, which caused a memory leak in one of the containers, which led all Kubernetes nodes to run out of memory. 
  • He's supremely embarrassed about this, especially because the DevOps team had put so much effort into educating developers like him about Kubernetes and the importance of a memory limit.
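The manifest that tripped Bob up isn't shown, but the standard Kubernetes fix is to declare memory requests and limits on each container so a single leaky pod cannot exhaust a node. A minimal sketch, with every name and value purely illustrative:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: rentals-api              # illustrative name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: rentals-api
  template:
    metadata:
      labels:
        app: rentals-api
    spec:
      containers:
        - name: api
          image: example/rentals-api:1.4.2   # pinned tag (hypothetical image)
          resources:
            requests:
              memory: "256Mi"    # what the scheduler reserves for the container
            limits:
              memory: "512Mi"    # past this, the container is OOM-killed
```

With a limit in place, the kubelet kills only the offending container instead of letting a leak take down the whole node.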

How could this happen? Well, imagine that Bob works at Unicorn Rentals. Like many companies, they started as a tiny founding team of two developers, a CEO, and a CTO. Things were slow at first, but eventually everybody wanted to rent a unicorn, and when that happened, the company couldn't afford production outages.

A series of accidents like the one that woke Bob up at 3:46AM led the company to realize that something had to change.

If that mirrors scenarios in your own organization, then it could be that something needs to change for you, too.

The problem: scaling security policies

To avoid uncomfortable development issues and significant bugs in production, you need to educate your developers. They need to know what Kubernetes is, how it works, how to develop for it, and what they can do with it.

You also need to define policies so that any resource that doesn't match certain specifications never enters the cluster. But what happens when there are hundreds of repos? How are those policies managed at scale? How can procedures be monitored and reviewed?

Datree is an open source command-line solution that enables Kubernetes admins to create policies and best practices they want the team to follow.

Datree allows admins to: 

  • Enforce policy restrictions on development: Enforce restrictions before applying resources to the cluster.
  • Enable restrictions management: Flexible management of restrictions in a dedicated place across the entire organization empowers administrators to control their systems fully.
  • Educate about best practices: Free DevOps engineers from the constant need to review, fence off, and future-proof every possible pitfall across all current and future self-service deployments. 
Why Datree?

Datree aims to help admins gain maximum production stability with minimum time and effort by enforcing policies before misconfigured resources reach production. 

  • Education and best-practice assurance: The CLI application simplifies the Kubernetes deployment experience, so developers don't need to memorize every rule governing development, and DevOps engineers no longer form a bottleneck. Datree's CLI application comes with Kubernetes best practices built in, so there's no need to rely on human observation and memory. 
  • Enforcement on development: Developers are alerted early, as soon as a misconfiguration occurs in the PR. This way, they can catch mistakes before their code moves to production/collaborative environments.
  • DevOps culture: Datree provides a mechanism similar to other development tools like unit tests. This makes it easier for developers because they are already used to these tools. Testing is the most common activity that developers carry out. Using familiar tools can be a great foundation for cultivating a DevOps culture.
How Datree works

The datree command runs automatic checks on every resource that exists in a given path. These automatic checks include three main validation types: 

  1. YAML validation
  2. Kubernetes schema validation
  3. Kubernetes policy validation
$ datree test ~/.datree/k8s-demo.yaml >> File: .datree/k8s-demo.yaml
[V] YAML validation
[V] Kubernetes schema validation
[X] Policy check

X Ensure each container image has a pinned (tag) version [1 occurrence]
  - metadata.name: rss-site (kind: Deployment)
!! Incorrect value for key `image` - specify an image version to avoid unpleasant "version surprises" in the future

X Ensure each container has a configured memory limit [1 occurrence]
  - metadata.name: rss-site (kind: Deployment)
!! Missing property object 'limits.memory' - value should be within the accepted boundaries recommended by the organization

X Ensure workload has valid Label values [1 occurrence]
  - metadata.name: rss-site (kind: Deployment)
!! Incorrect value for key(s) under 'labels' - the values syntax is not valid so the Kubernetes engine will not accept it

X Ensure each container has a configured liveness probe [1 occurrence]
 - metadata.name: rss-site (kind: Deployment)
!! Missing property object 'livenessProbe' - add a properly configured livenessProbe to catch possible deadlocks

[...]

After the check is complete, Datree displays detailed output for every violation or misconfiguration it finds, guiding developers to fix the issue. You can run the command locally, but it is specially designed to run during continuous integration (CI) or even earlier, as a pre-commit hook (yes, without losing any of the explanations behind each policy).

Along with the command-line application, Datree enables complete management of policies using the UI, like creating new customized policies, reviewing the full history of the invocations, and more.

Image by: Noaa Barki, CC BY-SA 4.0

How I've embraced the DevOps mindset

As a full stack developer with a front-end background, I was trained to think solely about code, and I always found DevOps technologies and thought processes to be a mystery. But recently, I was challenged to develop a CLI application at Datree and began to understand the importance and functionality of DevOps.

My mantra is, "Our job as developers isn't about coding—it's about solving real-life problems." When I started working on datree, I had to understand more than just the real-life problem. I also had to know how it became a problem in the first place. Why do organizations adopt Kubernetes? What's the role of the DevOps engineer? And most of all, for whom am I developing my application?

Now I can honestly say that through developing datree, I entered the world of Kubernetes and learned that the best way to learn Kubernetes is by embracing DevOps culture. Developing the datree command has taught me the importance of understanding my user persona. More importantly, it helped me gain fundamental knowledge about the ecosystem of an application and understand the product and user journey.

Summary

When Kubernetes is adopted, the culture of your development environment changes. DevOps isn't something that happens overnight, especially in a large organization. This transition can be aided with technology that helps developers catch their own mistakes and learn from them in the future. 

With Datree, the gap between DevOps and developers has begun to shrink. Even diehard coders like me have started to take ownership of limitation policies. The code sent to production is of higher quality, saving time and preventing embarrassing mistakes.

