Open-source News

Switch from Google Workspace to Nextcloud

opensource.com - Mon, 03/06/2023 - 16:00

If you're wary of committing your data to cloud services controlled by a corporation but you love the convenience of remote storage and easy web-based access, then you're not alone. The cloud is popular because of what it can do. But the cloud doesn't have to be closed. Luckily, the open source Nextcloud project provides a personal and private cloud application suite.

It's easy to install and import data—including contacts, calendars, and photos. The real trick is getting your data from cloud providers like Google. In this article I demonstrate the steps you need to take to migrate your digital life from an Android device to Nextcloud.

Migrate your data to Nextcloud

I wrote this article using a Raspberry Pi running Nextcloud, but the process is the same regardless of how you choose to run Nextcloud.

Two network protocols are used to exchange data between Nextcloud and your Android device: CalDAV (calendars) and CardDAV (contacts). Android doesn't natively support these two protocols, so you need an additional app on your Android smartphone or tablet. DAVx⁵ synchronizes calendars and contacts between Android devices and Nextcloud. It is available as an APK free of charge from F-Droid, or for approximately US$6 from the Google Play Store and other app stores.

Before you set up synchronization with the app, you need to export existing contacts and calendars and import them into Nextcloud:

  1. Log into your Google account and make sure the automatic synchronization of contacts and appointments with Google is switched On. This ensures that the data you export from the Google cloud is up to date.

  2. Open the Google Apps menu and select the Contacts entry to open the address book. In the left sidebar, click on Export. In the next dialogue, select all contacts and save everything as a vCard (.vcf file). If you only want to export certain address book entries, select them beforehand and choose Selected contacts in the Export dialogue.

  3. Import the .vcf file into Nextcloud. To do this, select the Contacts app, click Settings at the bottom left, and click the Import contacts button. In the following dialogue window, click Select local file, and open the previously saved vCard.
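If you want to inspect the export first, or import only some entries as individual files, you can split Google's single .vcf export into one card per contact with a short Python sketch (the file names are examples, not anything Nextcloud requires):

```python
def split_vcards(text):
    """Split a multi-contact vCard export into one BEGIN/END block per contact."""
    cards, current = [], []
    for line in text.splitlines():
        current.append(line)
        if line.strip().upper() == "END:VCARD":
            cards.append("\n".join(current) + "\n")
            current = []
    return cards

# Usage (file names are examples):
# cards = split_vcards(open("contacts.vcf", encoding="utf-8").read())
# for i, card in enumerate(cards):
#     open(f"contact_{i}.vcf", "w", encoding="utf-8").write(card)
```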

It's just as simple to export and then import your calendars:

  1. Visit the Google website and open the Calendar app. On the left, you see your own and subscribed calendars (My calendars). The right side displays the day, week, or month view. Click the gear icon to access the settings.

  2. On the left, click Import and export. By default, all calendars are marked for export. There is no option to save only individual calendars.

  3. Click Export and save the .zip file to the hard disk. Unpack the .zip file. It contains several .ics files (iCalendar format), one for each Google calendar.

  4. Open the Calendar app in Nextcloud, click Calendar settings at the bottom left and then Import calendar. Select one or more .ics files you saved in the file manager.
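Google delivers the calendars as a .zip of .ics files. Before importing, you can list what the archive contains with a short Python sketch (the file name is an example):

```python
import zipfile

def list_ics(zip_path_or_file):
    """Return the names of the iCalendar (.ics) files inside a calendar export zip."""
    with zipfile.ZipFile(zip_path_or_file) as z:
        return [name for name in z.namelist() if name.lower().endswith(".ics")]

# Usage (file name is an example):
# for name in list_ics("calendar-export.zip"):
#     print(name)
```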


After a short time, all events appear in your Nextcloud calendar. Repeat this process for all Google calendars. Now everything is ready to replace the old Google synchronization service. From now on, you create new entries through your Android device.

Connecting to your Nextcloud account

Install the DAVx⁵ app and confirm it has access to your calendars and contacts. Tap the orange plus sign to connect the Nextcloud account to the app. Select Login with URL and user name, enter your Nextcloud user name and password. The Base URL field contains an address that you can find in Nextcloud's calendar app. Click on Calendar settings at the bottom left and scroll down to Copy primary CalDAV address.

Click Login to connect to your Nextcloud, click Create account, and select Groups are separate vCards from the Contact group method drop-down menu. You can use the sliders in the CardDAV and CalDAV sections to define the address books and calendars you want the app to synchronize.

Once the app has completed the synchronization process, you can access the data through standard Android apps for calendars and address books. You can start the synchronization manually with the Refresh icon at the bottom right, or set the app to always run in the background. Note that this may affect the battery life of the device.

This article has been adapted from Heike Jurzik's book, Nextcloud on the Raspberry Pi.

How to synchronize your data between Android and Nextcloud.

This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.

How Wikipedia helps keep the internet open

opensource.com - Mon, 03/06/2023 - 16:00

Wikipedia is one of the most significant open source software projects, in part because it's a lot bigger than you may realize. And yet anyone can contribute content, and anyone can contribute code to many technical areas of the projects that work behind the curtain to keep Wikipedia running.

Over 870 Wikipedia and related Wikimedia sites are available in different languages, and all of them operate with a common goal of “developing free educational content and disseminating it effectively and globally.” For example, Wikimedia Commons is a repository of free media files, and as of today, it has over 68 million images. Wikisource is a free library of textual sources with over 5 million articles and website subdomains active for 72 languages. Wikidata is an accessible repository of over 99 million data items used across several Wikipedia-related sites.

These projects are supported and maintained by the Wikimedia Foundation, a non-profit organization headquartered in San Francisco. The organization also empowers hundreds of thousands of volunteers worldwide to contribute free knowledge to these projects. Behind this community of knowledge gatherers and producers, a lot of work goes into maintenance, technical support, and administration to keep these sites up and running. From the outside looking in, you might wonder what work could possibly remain in developing Wikipedia’s software. After all, it’s one of the ten most visited websites in the world, serves its purpose well, and provides access to the best possible information.

The truth is that every article on Wikipedia leverages thousands of software tools for its creation, editing, and maintenance. These are crucial steps in ensuring equitable, reliable, and fast access to information no matter where you are in the world. When you browse Wikipedia or any other Wikimedia sites, the software you interact with is called MediaWiki, a powerful collaboration and documentation software that powers the content of Wikipedia. It comes with a default set of features. To further enhance the software’s capabilities, you can install various extensions. They’re too numerous to mention, but two notable extensions are:

  • VisualEditor: A WYSIWYG rich-text editor for MediaWiki-powered wikis
  • Wikibase: Allows storing, managing, and accessing structured data which Wikipedia pulls from Wikidata.

All of this apparently ancillary tooling makes up the modern Wikipedia, and each piece is important to its functioning.
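Much of that tooling talks to MediaWiki through its public HTTP Action API. As a sketch of what such a call looks like from Python (standard library only; the article title and exact query parameters here are illustrative):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def build_query(title):
    """Build a MediaWiki Action API URL requesting a page's intro extract as JSON."""
    params = {
        "action": "query",
        "prop": "extracts",
        "exintro": 1,
        "explaintext": 1,
        "titles": title,
        "format": "json",
    }
    return API + "?" + urlencode(params)

# Usage (requires network access):
# import json, urllib.request
# with urllib.request.urlopen(build_query("MediaWiki")) as resp:
#     print(json.load(resp)["query"]["pages"])
```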

Wikimedia and MediaWiki

Overall, Wikipedia’s technology ecosystem is vast! Because MediaWiki, one of the most popular pieces of software in the Wikimedia world, is available under an open source license, over four hundred thousand projects and organizations use it to host their content. For example, NASA uses it to organize its content around space missions and their knowledge base!

In addition, there are many other bots, tools, desktop and mobile apps that help with content access, creation, editing, and maintenance. For example, bots in particular help drastically reduce the workload of editors by automating repetitive and tedious tasks, such as fighting vandalism, suggesting articles to newcomers, fact-checking articles, and more. InternetArchiveBot is a popular bot that frequently communicates with the Wayback Machine to fix dead links on Wikipedia.

Tools are software applications that support various contributors in their work. For example, organizers can access tools for conducting editathons, running campaigns, educational courses around Wikipedia editing, and so on. As of May 2022, bots and tools contribute 36.6% of edits made to 870 Wikimedia wikis, demonstrating their significant impact on the ecosystem.

Kiwix is a well-known offline reader and a desktop application that provides access to Wikipedia in limited internet access regions, particularly in educational settings. Mobile apps for Wikipedia and Wikimedia Commons allow editors to contribute articles and media files through their devices too, making our knowledge platforms accessible to a larger audience around the world.

The next time you are browsing a Wikipedia article and notice frequent changes being made to it in real-time in the wake of a recent event, you might be able to visualize better what might be happening behind the scenes.

Wikipedia’s technical community

Wikipedia was launched in 2001 with about ten developers. Since the inception of the Wikimedia Foundation in 2003, the developer pool has grown vastly. About a thousand developers now contribute to various projects within our knowledge movement. This number fluctuates yearly, depending on the number of active contributors and staff members, initiatives supporting volunteer developers, global events such as the pandemic, and so on.

Members in the technical community contribute in various ways and roles. There are code contributors, documentarians, designers, advocates, mentors, community organizers, testers, translators, site administrators, and more.

According to a survey of new developers, Wikimedia, like other open source projects, draws many contributors from the United States, Europe, and India, and is growing in other regions of the world.

Volunteer developers have similar motivations as Wikipedia editors. They join as contributors to support the free knowledge mission, learn and gain new skills, improve the experience of other editors, and so on. One of the volunteer developers from India says, “While I joined as an editor, I started to familiarize myself with the tech behind Wikipedia because there were significantly fewer contributors in the Hindi Wikipedia community who could address our local language needs through technology.”


Between July 2021 and June 2022, looking only at code repositories hosted in Wikimedia’s Gerrit instance, 514 developers contributed 45,621 merged software changes to 1225 repositories. Of these contributions, 48.52% came from outside the Wikimedia Foundation by other organizations and independent developers. Some of these developers are also part of various user groups, chapters, and affiliate bodies working in different regions to promote the use and encourage contributions to Wikimedia projects. These numbers do not include the additional developers who chose to host their code externally instead, or code that is hosted directly on wiki pages, such as gadgets or modules.

Making a difference

Wikipedia is a vast repository of knowledge, available to everyone. In many ways, it’s the embodiment of the original vision of what the internet can and should be: A source of information, understanding, and collaboration.

You can be a part of Wikipedia as a contributor, either by sharing your knowledge in articles, or by helping to build the software that makes it all possible. If you’re interested in joining Wikimedia’s technical community, then explore the resources on our developer site, and learn how to get involved.

Wikipedia embodies the spirit of the original vision of the internet, and you can be a part of it.


Build a Raspberry Pi monitoring dashboard in under 30 minutes

opensource.com - Mon, 03/06/2023 - 16:00

If you’ve ever wondered about the performance of your Raspberry Pi, then you might need a dashboard for your Pi. In this article, I demonstrate how to quickly build an on-demand monitoring dashboard for your Raspberry Pi so you can see your CPU performance, memory, and disk usage in real time, and add more views and actions later as you need them.

If you’re already using Appsmith, you can also import the sample app directly and get started.

Appsmith

Appsmith is an open source, low-code app builder that helps developers build internal apps like dashboards and admin panels easily and quickly. It’s a great choice for your dashboard, and reduces the time and complexity of traditional coding approaches.

For the dashboard in this example, I display usage stats for:

  • CPU
    • Percentage utilization
    • Frequency or clock speed
    • Count
    • Temperature
  • Memory
    • Percentage utilization
    • Percentage available memory
    • Total memory
    • Free memory
  • Disk
    • Percentage disk utilization
    • Absolute disk space used
    • Available disk space
    • Total disk space

Creating an endpoint

You need a way to get this data from your Raspberry Pi (RPi) and into Appsmith. The psutil Python library is useful for monitoring and profiling, and the Flask-RESTful extension for Flask creates the REST API.

Appsmith calls the REST API every few seconds to refresh data automatically, and gets a JSON object in response with all desired stats as shown:

{
  "cpu_count": 4,
  "cpu_freq": [
    600.0,
    600.0,
    1200.0
  ],
  "cpu_mem_avail": 463953920,
  "cpu_mem_free": 115789824,
  "cpu_mem_total": 971063296,
  "cpu_mem_used": 436252672,
  "cpu_percent": 1.8,
  "disk_usage_free": 24678121472,
  "disk_usage_percent": 17.7,
  "disk_usage_total": 31307206656,
  "disk_usage_used": 5292728320,
  "sensor_temperatures": 52.616
}

1. Set up the REST API

If your Raspberry Pi doesn’t have Python on it yet, open a terminal on your Pi and run this install command:

$ sudo apt install python3

Now set up a Python virtual environment for your development:

$ python -m venv PiData

Next, activate the environment. You must do this again after any reboot of your Pi:

$ source PiData/bin/activate
$ cd PiData

To install Flask and Flask-RESTful and dependencies you’ll need later, create a file in your Python virtual environment called requirements.txt and add these lines to it:

flask
flask-restful
gunicorn

Save the file, and then use pip to install all of the dependencies at once:

(PiData)$ python -m pip install -r requirements.txt

Next, create a file named pi_stats.py to house the logic for retrieving the RPi’s system stats with psutil. Paste this code into your pi_stats.py file:

from flask import Flask
from flask_restful import Resource, Api
import psutil

app = Flask(__name__)
api = Api(app)

class PiData(Resource):
    def get(self):
        return "RPI Stat dashboard"

api.add_resource(PiData, '/get-stats')

if __name__ == '__main__':
    app.run(debug=True)

Here’s what the code is doing:

  • Use app = Flask(__name__) to define the app that nests the API object.
  • Use Flask-RESTful’s API method to define the API object.
  • Define PiData as a concrete Resource class in Flask-RESTful to expose methods for each supported HTTP method.
  • Attach the resource, PiData, to the API object, api, with api.add_resource(PiData, '/get-stats').
  • Whenever you hit the URL /get-stats, PiData is returned as the response.

2. Read stats with psutil

To get the stats from your Pi, you can use these built-in functions from psutil:

  • cpu_percent, cpu_count, cpu_freq, and sensors_temperatures for the percentage utilization, count, clock speed, and temperature, respectively, of the CPU. sensors_temperatures reports the temperature of all the devices connected to the RPi; to get just the CPU’s temperature, use the key cpu-thermal.
  • virtual_memory for total, available, used, and free memory stats in bytes.
  • disk_usage to return the total, used, and free stats in bytes.

Combining all of the functions in a Python dictionary looks like this:

memory = psutil.virtual_memory()
disk = psutil.disk_usage('/')

system_info_data = {
    'cpu_percent': psutil.cpu_percent(1),
    'cpu_count': psutil.cpu_count(),
    'cpu_freq': psutil.cpu_freq(),
    'cpu_mem_total': memory.total,
    'cpu_mem_avail': memory.available,
    'cpu_mem_used': memory.used,
    'cpu_mem_free': memory.free,
    'disk_usage_total': disk.total,
    'disk_usage_used': disk.used,
    'disk_usage_free': disk.free,
    'disk_usage_percent': disk.percent,
    'sensor_temperatures': psutil.sensors_temperatures()['cpu-thermal'][0].current,
}

The next section uses this dictionary.

3. Fetch data from the Flask-RESTful API

To see data from your Pi in the API response, update pi_stats.py to include the dictionary system_info_data in the class PiData:

from flask import Flask
from flask_restful import Resource, Api
import psutil

app = Flask(__name__)
api = Api(app)

class PiData(Resource):
    def get(self):
        memory = psutil.virtual_memory()
        disk = psutil.disk_usage('/')
        system_info_data = {
            'cpu_percent': psutil.cpu_percent(1),
            'cpu_count': psutil.cpu_count(),
            'cpu_freq': psutil.cpu_freq(),
            'cpu_mem_total': memory.total,
            'cpu_mem_avail': memory.available,
            'cpu_mem_used': memory.used,
            'cpu_mem_free': memory.free,
            'disk_usage_total': disk.total,
            'disk_usage_used': disk.used,
            'disk_usage_free': disk.free,
            'disk_usage_percent': disk.percent,
            'sensor_temperatures': psutil.sensors_temperatures()['cpu-thermal'][0].current,
        }
        return system_info_data

api.add_resource(PiData, '/get-stats')

if __name__ == '__main__':
    app.run(debug=True)

Your script’s ready. Run the pi_stats.py script:

$ python pi_stats.py
 * Serving Flask app "pi_stats" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not run this in a production environment.
 * Debug mode: on
 * Running on http://127.0.0.1:5000 (Press CTRL+C to quit)
 * Restarting with stat
 * Debugger is active!

You have a working API!

4. Make the API available to the internet

You can interact with your API on your local network. To reach it over the internet, however, you must open a port in your firewall and forward incoming traffic to the port Flask serves on. However, as the output of your test run advised, Flask’s built-in server is meant for development, not for production. To make your API available to the internet safely, use the gunicorn production server, which you installed during the project setup stage.

Now you can start your Flask API. You must do this any time you’ve rebooted your Pi.

$ gunicorn -w 4 -b 0.0.0.0:8000 'pi_stats:app'
Serving on http://0.0.0.0:8000

To reach your Pi from the outside world, open a port in your network firewall and direct incoming traffic to the IP address of your Pi, at port 8000.

First, get the internal IP address of your Pi:

$ ip addr show | grep inet

Internal IP addresses start with 10., 192.168., or 172.16. through 172.31.
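Those prefixes correspond to the private address ranges reserved by RFC 1918. If you want to check an address programmatically, Python's standard library can tell you:

```python
import ipaddress

def is_internal(addr):
    """True when addr is in a private range such as 10/8, 172.16/12, or 192.168/16."""
    return ipaddress.ip_address(addr).is_private

# Examples:
# is_internal("192.168.1.20")  -> True
# is_internal("8.8.8.8")       -> False
```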

Next, you must configure your firewall. There’s usually a firewall embedded in the router you get from your internet service provider (ISP). Generally, you access your home router through a web browser. Your router’s address is sometimes printed on the bottom of the router, and it begins with either 192.168 or 10. Every device is different, though, so there’s no way for me to tell you exactly what you need to click on to adjust your settings. For a full description of how to configure your firewall, read Seth Kenlon’s article Open ports and route traffic through your firewall.

Alternately, you can use localtunnel, a dynamic port-forwarding service.

Once you’ve got traffic going to your Pi, you can query your API:

$ curl https://example.com/get-stats { "cpu_count": 4, "cpu_freq": [ 600.0, 600.0, 1200.0 ], "cpu_mem_avail": 386273280, ...

If you have gotten this far, the toughest part is over.

5. Repetition

If you reboot your Pi, you must follow these steps:

  1. Reactivate your Python environment with source
  2. Refresh the application dependencies with pip
  3. Start the Flask application with gunicorn

Your firewall settings are persistent, but if you’re using localtunnel, then you must also start a new tunnel after a reboot.

You can automate these tasks if you like, but that’s a whole other tutorial. In the final section of this tutorial, you build a UI on Appsmith using drag-and-drop widgets and a bit of JavaScript to bind your RPi data to the UI. Believe me, it’s easy going from here on out!

Build the dashboard on Appsmith. (Image by Keyur Paralkar, CC BY-SA 4.0)

To get to a dashboard like this, you need to connect the exposed API endpoint to Appsmith, build the UI using Appsmith’s widgets library, and bind the API’s response to your widgets. If you’re already using Appsmith, you can just import the sample app directly and get started.

If you haven’t done so already, sign up for a free Appsmith account. Alternately, you can self-host Appsmith.

Connect the API as an Appsmith datasource

Sign in to your Appsmith account.

  1. Find and click the + button next to QUERIES/JS in the left nav.
  2. Click Create a blank API.
  3. At the top of the page, name your project PiData.
  4. Get your API’s URL. If you’re using localtunnel, then that’s a localtunnel.me address; append /get-stats to the end of it for the stat data. Paste it into the first blank field on the page, and click the RUN button.

Confirm that you see a successful response in the Response pane.

(Image by Keyur Paralkar, CC BY-SA 4.0)

Build the UI

The interface for Appsmith is pretty intuitive, but I recommend going through the Building your first application on Appsmith tutorial if you feel lost.

For the title, drag and drop a Text, Image, and Divider widget onto the canvas. Arrange them like this:

(Image by Keyur Paralkar, CC BY-SA 4.0)

The Text widget contains the actual title of your page. Type in something cooler than “Raspberry Pi Stats”.

The Image widget houses a distinct logo for the dashboard. You can use whatever you want.

Use a Switch widget for a toggled live data mode. Configure it in the Property pane to get data from the API you’ve built.

For the body, create a place for CPU Stats with a Container widget using the following widgets from the Widgets library on the left side:

  • Progress Bar
  • Stat Box
  • Chart

Do the same for the Memory and Disk stats sections. You don’t need a Chart for disk stats, but don’t let that stop you from using one if you can find uses for it.

Your final arrangement of widgets should look something like this:

(Image by Keyur Paralkar, CC BY-SA 4.0)

The final step is to bind the data from the API to the UI widgets you have.

Bind data to the widgets

Head back to the canvas and find your widgets in sections for the three categories. Set the CPU Stats first.

To bind data to the Progress Bar widget:

  1. Click the Progress Bar widget to see the Property pane on the right.
  2. Look for the Progress property.
  3. Click the JS button to activate Javascript.
  4. Paste {{PiData.data.cpu_percent ?? 0}} in the field for Progress. That code references the stream of data from your API named PiData. Appsmith caches the response data within the .data property of PiData. The key cpu_percent holds the value Appsmith displays, in this case the CPU utilization percentage.
  5. Add a Text widget below the Progress Bar widget as a label.

(Image by Keyur Paralkar, CC BY-SA 4.0)

There are three Stat Box widgets in the CPU section. Binding data to each one is the exact same as for the Progress Bar widget, except that you bind a different data attribute from the .data operator. Follow the same procedure, with these exceptions:

  • {{ PiData.data.cpu_freq[0] ?? 0 }} to show clock speed.
  • {{ PiData.data.cpu_count ?? 0 }} for CPU count.
  • {{ PiData.data.sensor_temperatures.toPrecision(3) ?? 0 }} for CPU temperature data.

Assuming all goes to plan, you end up with a pretty dashboard like this one:

(Image by Keyur Paralkar, CC BY-SA 4.0)

CPU utilization trend

You can use a Chart widget to display the CPU utilization as a trend line, and have it automatically update over time.

First, click the widget, find the Chart Type property on the right, and change it to LINE CHART. To see a trend line, store cpu_percent in an array of data points. Your API currently returns this as a single data point in time, so use Appsmith’s storeValue function (an Appsmith-native implementation of a browser’s setItem method) to get an array.

Click the + button next to QUERIES/JS, create a new JavaScript object, and name it utils.

Paste this Javascript code into the Code field:

export default {
    getLiveData: () => {
        // When the switch is on:
        if (Switch1.isSwitchedOn) {
            setInterval(() => {
                let utilData = appsmith.store.cpu_util_data;
                PiData.run();
                storeValue("cpu_util_data", [...utilData, {
                    x: PiData.data.cpu_percent,
                    y: PiData.data.cpu_percent
                }]);
            }, 1500, 'timerId');
        } else {
            clearInterval('timerId');
        }
    },
    initialOnPageLoad: () => {
        storeValue("cpu_util_data", []);
    }
}

To initialize the Store, you've created a function called initialOnPageLoad in the object, and housed the storeValue function in it.

The call storeValue("cpu_util_data", []) creates an empty array under the key cpu_util_data. Because the function runs on page load, the Store starts fresh each time the page is refreshed.

To draw a trend line, you need an array of points, so getLiveData appends an object with x and y values, both taken from the cpu_percent data attribute, each time it runs.

You also want this data stored automatically at a set interval. When the setInterval function executes:

  1. The current array stored under cpu_util_data is fetched.
  2. The API PiData is called.
  3. The latest cpu_percent reading is appended to the array as the x and y values.
  4. The updated array is stored back under the key cpu_util_data.
  5. Steps 1 through 4 repeat only while the function is set to auto-execute. You set it to auto-execute with the Switch widget, which is why there is a getLiveData parent function.
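The steps above can be sketched in plain Python, with a hypothetical fetch standing in for the PiData call and a list standing in for the Appsmith Store:

```python
def append_point(series, cpu_percent, max_points=100):
    """Take the stored series, append the latest reading as an {x, y} point,
    and return the updated series, bounded to max_points entries."""
    return (series + [{"x": cpu_percent, "y": cpu_percent}])[-max_points:]

# One timer tick (fetching fresh stats from the API is the step in between):
# series = append_point(series, fetch_stats()["cpu_percent"])
```

Bounding the series is optional, but it keeps the chart from growing without limit during a long live session.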

Navigate to the Settings tab to find all the parent functions in the object and set initialOnPageLoad to Yes in the RUN ON PAGE LOAD option.

(Image by Keyur Paralkar, CC BY-SA 4.0)

Now refresh the page to confirm the setting.

Return to the canvas. Click the Chart widget and locate the Chart Data property. Paste the binding {{ appsmith.store.cpu_util_data }} into it. The chart fills in if you run the utils object yourself a few times. To run it automatically:

  1. Find and click the Live Data Switch widget in your dashboard’s title.
  2. Look for the onChange event.
  3. Bind it to {{ utils.getLiveData() }}. The Javascript object is utils, and getLiveData is the function that activates when you toggle the Switch on, which fetches live data from your Raspberry Pi. But there’s other live data, too, so the same switch works for them. Read on to see how.

Bind all the data

Binding data to the widgets in the Memory and Disk sections is similar to how you did it for the CPU Stats section.

For Memory, bindings change to:

  • {{ (PiData.data.cpu_mem_avail/1000000000).toPrecision(2) * 100 ?? 0 }} for the Progress Bar.
  • {{ (PiData.data.cpu_mem_used/1000000000).toPrecision(2) ?? 0 }} GB, {{ (PiData.data.cpu_mem_free/1000000000).toPrecision(2) ?? 0 }} GB, and {{ (PiData.data.cpu_mem_total/1000000000).toPrecision(2) ?? 0 }} GB for the three Stat Box widgets.

For Disk, the bindings on the Progress Bar and Stat Box widgets change respectively to:

  • {{ PiData.data.disk_usage_percent ?? 0 }} for the Progress Bar.
  • {{ (PiData.data.disk_usage_used/1000000000).toPrecision(2) ?? 0 }} GB, {{ (PiData.data.disk_usage_free/1000000000).toPrecision(2) ?? 0 }} GB, and {{ (PiData.data.disk_usage_total/1000000000).toPrecision(2) ?? 0 }} GB for the three Stat Box widgets.
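The divisions by 1000000000 in these bindings convert psutil's raw byte counts into decimal gigabytes, rounded to two significant digits by toPrecision(2). A small Python helper (hypothetical, just for checking the numbers) does the same:

```python
def to_gb(n_bytes, digits=2):
    """Convert a byte count to decimal gigabytes with the given number of
    significant digits, mirroring the (bytes/1000000000).toPrecision(2) bindings."""
    return float(f"{n_bytes / 1_000_000_000:.{digits}g}")

# Values from the sample API response earlier in the article:
# to_gb(31307206656)  -> 31.0  (total disk)
# to_gb(5292728320)   -> 5.3   (used disk)
```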

The Chart in this section needs one more change to the utils object you created for CPU Stats: a storeValue key called disk_util_data, nested under getLiveData, that follows the same logic as cpu_util_data. For the disk utilization chart, disk_util_data stores x and y values from disk_usage_percent, just like the CPU utilization trend chart.

export default {
    getLiveData: () => {
        // When the switch is on:
        if (Switch1.isSwitchedOn) {
            setInterval(() => {
                const cpuUtilData = appsmith.store.cpu_util_data;
                const diskUtilData = appsmith.store.disk_util_data;
                PiData.run();
                storeValue("cpu_util_data", [...cpuUtilData, {
                    x: PiData.data.cpu_percent,
                    y: PiData.data.cpu_percent
                }]);
                storeValue("disk_util_data", [...diskUtilData, {
                    x: PiData.data.disk_usage_percent,
                    y: PiData.data.disk_usage_percent
                }]);
            }, 1500, 'timerId');
        } else {
            clearInterval('timerId');
        }
    },
    initialOnPageLoad: () => {
        storeValue("cpu_util_data", []);
        storeValue("disk_util_data", []);
    }
}

Visualizing the flow of data triggered by the Switch toggling live data on and off with the utils Javascript object looks like this:

(Image by Keyur Paralkar, CC BY-SA 4.0)

Toggled on, the charts change like this:

(Image by Keyur Paralkar, CC BY-SA 4.0)


Pretty, minimalistic, and totally useful.

Enjoy

As you get more comfortable with psutil, JavaScript, and Appsmith, I think you’ll find you can tweak your dashboard easily and endlessly to do really cool things like:

  • See trends from the previous week, month, quarter, year, or any custom range that your RPi data allows
  • Build an alert bot for threshold breaches on any stat
  • Monitor other devices connected to your Raspberry Pi
  • Run psutil on another computer with Python installed
  • Monitor your home or office network using another library
  • Monitor your garden
  • Track your own life habits

Until the next awesome build, happy hacking!

Use Python to make an API for monitoring your Raspberry Pi hardware and build a dashboard with Appsmith.


Linux 6.3-rc1 Brings File-System Optimizations, HID-BPF, More Intel & AMD Features

Phoronix - Mon, 03/06/2023 - 07:29
The merge window for Linux 6.3 is now over and Linus Torvalds just released Linux 6.3-rc1...

Linux 6.3 Drops Support For The Intel ICC Compiler

Phoronix - Mon, 03/06/2023 - 04:23
On this last day of the Linux 6.3 kernel merge window, Linus Torvalds merged the patch dropping support for the Intel ICC compiler. Specifically, this is Intel's long-standing ICC compiler, now known as the "Intel C++ Compiler Classic," which preceded the transition to the LLVM/Clang-based modern Intel DPC++ compiler...

Testing The First PCIe Gen 5.0 NVMe SSD On Linux Has Been Disappointing

Phoronix - Mon, 03/06/2023 - 02:48
This past week saw the first two consumer PCIe 5.0 NVMe solid-state drives released to retail: the Gigabyte AORUS Gen5 10000 and the Inland TD510. I've been testing the Inland TD510 2TB Gen 5 NVMe SSD the past few days. While in simple I/O testing it can hit speeds of almost 10,000 MB/s for reads and writes, in more complex workloads it quickly dropped behind popular PCIe Gen 4.0 NVMe SSD options. My testing thus far of this first consumer Gen5 NVMe SSD has left me far from impressed.

Latest System76 Intel-Powered Laptops Added To Coreboot

Phoronix - Sun, 03/05/2023 - 21:46
Merged on Saturday to upstream Coreboot was support for some of the latest Intel Alderlake (and signs of Raptor Lake) powered laptops from Linux vendor System76...
