The Linux Foundation

Decentralized innovation, built on trust.

Interview with Stephen Hendrick, Vice President of Research, Linux Foundation Research

Mon, 06/14/2021 - 23:07
Jason Perlow, Director of Project Insights and Editorial Content, spoke with Stephen Hendrick about Linux Foundation Research and how it will promote a greater understanding of the work being done by open source projects, their communities, and the Linux Foundation.

JP: It’s great to have you here today, and also, welcome to the Linux Foundation. First, can you tell me a bit about yourself, where you are from, and your interests outside work?

SH: I’m from the northeastern US.  I started as a kid in upstate NY and then came to the greater Boston area when I was 8.  I grew up in the Boston area, went to college back in upstate NY, and got a graduate degree in Boston.  I’ve worked in the greater Boston area since I was out of school and have really had two careers.  My first career was as a programmer, which evolved into project and product management doing global cash management for JPMC.  When I was in banking, IT was approached very conservatively, with a tagline like yesterday’s technology, tomorrow.  The best thing about JPMC was that it was where I met my wife.  Yes, I know, you’re never supposed to date anybody from work.  But it was the best decision I ever made.  After JPMC, my second career began as an industry analyst working for IDC, specializing in application development and deployment tools and technologies.  This was a long-lived 25+ year career followed by time with a couple of boutique analyst firms and cut short by my transition to the Linux Foundation.

Until recently, interests outside of work mainly included vertical pursuits — rock climbing during the warm months and ice climbing in the winter.  The day I got engaged, my wife (to be) and I had been climbing in the morning, and she jokes that if she didn’t make it up that last 5.10, I wouldn’t have offered her the ring.  However, having just moved to a house overlooking Mt. Hope Bay in Rhode Island, our outdoor pursuits will become more nautically focused.

JP: And from what organization are you joining us?

SH: I was lead analyst at Enterprise Management Associates, a boutique industry analyst firm.  I initially focused my practice area on DevOps, but in reality, since I was the only person with application development and deployment experience, I also covered adjacent markets that included primary research into NoSQL, Software Quality, PaaS, and decisioning.  

JP: Tell me a bit more about your academic and quantitative analysis background; I see you went to Boston University, which was my mom’s alma mater as well. 

SH:  I went to BU for an MBA.  In the process, I concentrated in quantitative methods, including decisioning, Bayesian methods, and mathematical optimization.  This built on my undergraduate math and economics focus and was a kind of predecessor to today’s data science focus.  The regression work that I did served me well as an analyst and was the foundation for much of the forecasting work I did and industry models that I built.  My qualitative and quantitative empirical experience was primarily gained through experience in the more than 100 surveys and in-depth interviews I have fielded.  

JP: What disciplines do you feel most influence your analytic methodology? 

SH: We now live in a data-driven world, and math enables us to gain insight into the data.  So math and statistics are the foundation that analysis is built on.  So, math is most important, but so is the ability to ask the right questions.  Asking the right questions provides you with the data (raw materials) shaped into insights using math.  So analysis ends up being a combination of both art and science.

JP: What are some of the most enlightening research projects you’ve worked on in your career? 

SH:  One of the most exciting projects I cooked up was to figure out how many professional developers there were in the world, by country, with five years of history and a 5-year forecast.  I developed a parameterized logistic curve tuned to each country using the CIA, WHO, UN, and selected country-level data.  It was a landmark project at the time and used by the world’s leading software and hardware manufacturers. I was flattered to find out six years later that another analyst firm had copied it (since I provided the generalized equation in the report).
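As a rough illustration of that kind of model (not the original study’s code or data), the short Python sketch below fits a parameterized logistic curve to hypothetical developer counts for a single country and projects five years forward; the data points, starting values, and bounds are invented for the example.

```python
# Illustrative only: fit a logistic growth curve to hypothetical
# country-level developer counts, then project five years ahead.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic curve: K = saturation level, r = growth rate, t0 = midpoint year."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical history: estimated professional developers (thousands).
years = np.array([2015, 2016, 2017, 2018, 2019, 2020], dtype=float)
devs = np.array([310, 340, 375, 405, 440, 470], dtype=float)

# Fit K, r, t0; the starting point and bounds keep the optimizer in a plausible range.
(K, r, t0), _ = curve_fit(
    logistic, years, devs,
    p0=[1000, 0.2, 2020],
    bounds=([500, 0.01, 2000], [5000, 1.0, 2040]),
)

# Five-year forecast from the fitted curve.
for year in range(2021, 2026):
    print(year, round(logistic(year, K, r, t0), 1))
```

In a per-country model like the one described above, the fitted parameters (saturation level, growth rate, midpoint) are what would be tuned country by country.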

I was also interested in finding that an up-and-coming SaaS company had used some of my published matrix data on language use, which showed huge growth in Ruby.  This company used my findings and other evidence to help drive its acquisition of a successful Ruby cloud application platform.

JP: I see that you have a lot of experience working at enterprise research firms, such as IDC, covering enterprise software development. What lessons do you think we can learn from the enterprise about how organizations adopting open source technologies should approach FOSS?

SH: The analyst community has struggled at times to understand the impact of OSS. Part of this stems from the economic foundation of the supply side research that gets done.  However, this has changed radically over the past eight years due to the success of Linux and the availability of a wide variety of curated open source products that have helped transform and accelerate the IT industry.  Enterprises today are less concerned about whether a product/service is open or closed source.  Primarily they want tools that are best able to address their needs. I think of this as a huge win for OSS because it validates the open innovation model that is characteristic of OSS. 

JP: So you are joining the Linux Foundation at a time when we have just gotten our research division off the ground. What are the kind of methodologies and practices that you would like to take from your years at firms like IDC and EMA and see applied to our new LF Research?

SH: LF is in the enviable position of having close relationships with IT luminaries, academics, hundreds of OSS projects, and a significant portion of the IT community.  The LF has an excellent opportunity to develop world-class research that helps the IT community, industry, and governments better understand OSS’s pivotal role in shaping IT going forward.

I anticipate that we will use a combination of quantitative and qualitative research to tell this story.  Quantitative research can deliver statistically significant findings, but qualitative interview-based research can provide examples, sound bites, and perspectives that help communicate a far more nuanced understanding of OSS’s relationship with IT.

JP: How might these approaches contrast with other forms of primary research, specifically human interviews? What are the strengths and weaknesses of the interview process?

SH: Interviews help fill in the gaps around discrete survey questions in ways that can be insightful, personal, entertaining, and unexpected.  Interviews can also provide context for understanding the detailed findings from surveys and provide confirmation or adjustments to models based on underlying data.

JP: What are you most looking forward to learning through the research process into open source ecosystems?

SH: The transformative impact that OSS is having on the digital economy and helping enterprises better understand when to collaborate and when to compete.

JP: What insights do you feel we can uncover with the quantitative analysis we will perform in our upcoming surveys? Are there things that we can learn about the use of FOSS in organizations?

SH: A key capability of empirical research is that it can be structured to highlight how enterprises are leveraging people, policy, processes, and products to address market needs.  Since enterprises vary widely in their approaches and in their best and worst practices in a particular market, data can help us build maturity models that advise enterprises on how to shape strategy and decisions based on the experience and best practices of others.

JP: Trust in technology (and other facets of society) is arguably at an all-time low right now. Do you see a role for LF Research to help improve levels of trust in not only software but in open source as an approach to building secure technologies? What are the opportunities for this department?

SH: I’m reminded of the old saying that there are “lies, damned lies, and then there are statistics.” If trust in technology is at an all-time low, it’s because there are people in this world with a certain moral flexibility, and the IT industry has not yet found effective ways to prevent the few from exploiting the many.  LF Research is in the unique position to help educate and persuade, through factual data and analysis, on accelerating improvements in IT security.

JP: Thanks, Steve. It’s been great talking to you today!


FINOS Announces 2021 State of Open Source in Financial Services Survey

Fri, 06/11/2021 - 00:00

FINOS, the fintech open source foundation, and its research partners, Linux Foundation Research, Scott Logic, WIPRO, and GitHub, are conducting a survey as part of a research project on the state of open source adoption, contribution, and readiness in the financial services industry. 

The increased prevalence, importance, and value of open source is well understood and widely reported by many industry surveys and studies. However, the rate at which different industries are acknowledging this shift and adapting their own working practices to capitalize on the new world of open source-first differs considerably.

The financial services industry has been a long-time consumer of open source software; however, many organizations struggle to contribute to and publish open source software and standards, and to adopt open source methodologies. A lack of understanding of how to build and deploy efficient tooling and governance models is often seen as a limiting factor.

This survey and report seek to explore open source within the context of financial services organizations, including banks, asset managers, and hedge funds, but are designed as a resource for all financial services organizations. The goal is to make this an annual survey with year-on-year tracking of metrics. 

Please participate now; we intend to close the survey in early July. Privacy and confidentiality are important to us. Neither participant names, nor their company names, will be published in the final results.

To take the 2021 FINOS Survey, click the button below:

Take Survey (EN)

BONUS

As a thank-you for completing this survey, you will receive a 75% discount code on enrollment in the Linux Foundation’s Open Source Management & Strategy training program, a $375 savings. This seven-course online training series is designed to help executives, managers, and software developers understand and articulate the basic concepts for building effective open source practices within their organization.


PRIVACY

Your name and company name will not be published. Reviews are attributed to your role, company size, and industry. Responses will be subject to the Linux Foundation’s Privacy Policy, available at https://linuxfoundation.org/privacy. Please note that survey partners who are not Linux Foundation employees will be involved in reviewing the survey results. If you do not want them to have access to your name or email address, please do not provide this information.

VISIBILITY

We will summarize the survey data and share the findings during Open Source Strategy Forum, 2021. The summary report will be published on the FINOS and Linux Foundation websites. 

QUESTIONS

If you have questions regarding this survey, please email us at info@finos.org


TODO Group Announces 2021 State of OSPO Survey

Fri, 06/11/2021 - 00:00

The TODO Group, together with Linux Foundation Research and The New Stack, is conducting a survey as part of a research project on the prevalence and outcomes of open source programs among different organizations across the globe. 

Open source program offices (OSPOs) help set open source strategies and improve an organization’s software development practices. Since 2018, the TODO Group has conducted surveys to assess the state of open source programs across the industry. Today, we are pleased to announce the launch of the 2021 edition featuring additional questions to add value to the community.

The survey will generate insights into the following areas:

  • The extent of adoption of open source programs and initiatives 
  • Concerns around the hiring of open source developers 
  • Perceived benefits and challenges of open source programs
  • The impact of open source on organizational strategy

We hope to expand the pool of respondents by translating the survey into Chinese and Japanese. Please participate now; we intend to close the survey in early July. Privacy and confidentiality are important to us. Neither participant names, nor their company names, will be published in the final results.

To take the 2021 OSPO Survey, click the button below:

Take Survey (EN) Take Survey (調査) Take Survey (民意调查)

BONUS

As a thank you for completing this survey, you will receive a 75% discount code on enrollment in The Linux Foundation’s Open Source Management & Strategy training program, a $375 savings. This seven-course online training series is designed to help executives, managers, and software developers understand and articulate the basic concepts for building effective open source practices within their organization.


PRIVACY

Your name and company name will not be published. Reviews are attributed to your role, company size, and industry. Responses will be subject to the Linux Foundation’s Privacy Policy, available at https://linuxfoundation.org/privacy. Please note that survey partners who are not Linux Foundation employees will be involved in reviewing the survey results. If you do not want them to have access to your name or email address, please do not provide this information.

VISIBILITY

We will summarize the survey data and share the findings during OSPOCon 2021. The summary report will be published on the TODO Group and Linux Foundation websites. 

QUESTIONS

If you have questions regarding this survey, please email us at info@todogroup.org


New Open Source Project Uses Machine Learning to Inform Quality Assurance for Construction in Emerging Nations

Thu, 06/10/2021 - 23:00

Linux Foundation with support from IBM and Call for Code hosts ‘Intelligent Supervision Assistant for Construction’ project from Build Change to help builders identify structural issues in masonry walls or concrete columns, especially in areas affected by disasters

SAN FRANCISCO, June 10, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced it will host the Intelligent Supervision Assistant for Construction (ISAC-SIMO) project, which was created by Build Change with a grant from IBM as part of the Call for Code initiative. The Autodesk Foundation, a Build Change funder, also contributed pro-bono expertise to advise the project’s development.

Build Change helps save lives in earthquakes and windstorms. Its mission is to prevent housing loss caused by disasters by transforming the systems that regulate, finance, build and improve houses around the world. 

ISAC-SIMO packages important construction quality assurance checks into a convenient mobile app. The tool harnesses the power of machine learning and image processing to provide feedback on specific construction elements such as masonry walls and reinforced concrete columns. Users can choose a building element check and upload a photo from the site to receive a quick assessment.

“ISAC-SIMO has amazing potential to radically improve construction quality and ensure that homes are built or strengthened to a resilient standard, especially in areas affected by earthquakes, windstorms, and climate change,” said Dr. Elizabeth Hausler, Founder & CEO of Build Change. “We’ve created a foundation from which the open source community can develop and contribute different models to enable this tool to reach its full potential. The Linux Foundation, building on the support of IBM over these past three years, will help us build this community.”

ISAC-SIMO was imagined as a solution to gaps in technical knowledge that were apparent in the field. The app ensures that workmanship issues can be more easily identified by anyone with a phone, instead of relying solely on technical staff. It does this by comparing user-uploaded images against trained models to assess whether the work done is broadly acceptable (go) or not (no go), along with a specific score. The project is itself built on open source software, including Python (via Django), Jupyter Notebooks, and React Native.
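As a simple illustration of the go/no-go pattern described above (this is not the ISAC-SIMO implementation), the Python sketch below wraps a placeholder model call: the model returns a workmanship score, and a hypothetical threshold converts that score into a go or no-go verdict. The function names, threshold, and returned value are assumptions made for the example.

```python
# Illustrative sketch of a go / no-go construction check; the model call is a
# stub so the example stays self-contained.
from dataclasses import dataclass

GO_THRESHOLD = 0.7   # hypothetical cut-off between acceptable and not


@dataclass
class CheckResult:
    element: str     # e.g. "masonry_wall" or "concrete_column"
    score: float     # 0.0 (poor workmanship) .. 1.0 (good workmanship)
    verdict: str     # "go" or "no go"


def score_element(element: str, image_path: str) -> float:
    """Placeholder for trained-model inference on a site photo (assumption).

    A real implementation would preprocess the uploaded image and return the
    model's confidence that the element meets the construction standard.
    """
    del element, image_path  # unused in this stub
    return 0.82              # fixed value purely for demonstration


def run_check(element: str, image_path: str) -> CheckResult:
    score = score_element(element, image_path)
    verdict = "go" if score >= GO_THRESHOLD else "no go"
    return CheckResult(element=element, score=score, verdict=verdict)


if __name__ == "__main__":
    print(run_check("masonry_wall", "site_photo.jpg"))
```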

“Due to the pandemic, the project deliverables and target audience have evolved. Rather than sharing information and workflows between separate users within the app, the app has pivoted to provide tools for each user to perform their own checks based on their role and location. This has led to a general framework that is well-suited for plugging in models from the open source community, beyond Build Change’s original use case,” said Daniel Krook, IBM Chief Technology Officer for the Call for Code Global Initiative.

IBM and The Linux Foundation have a rich history of deploying projects that drive fundamental change and progress in society through innovation – and remain committed during COVID-19. The winner of the 2018 Call for Code Global Challenge, Project OWL, contributed its IoT device firmware in March 2020 as the ClusterDuck Protocol, and since then, twelve more Call for Code deployment projects like ISAC-SIMO that address disasters, climate change, and racial justice have been open sourced for communities that need them most.

The project encourages new users to contribute and to deploy the software in new environments around the world. Priorities for short term updates include improvements in user interface, contributions to the image dataset for different construction elements, and support to automatically detect if the perspective of an image is flawed. For more information, please visit: ​https://www.isac-simo.net/docs/contribute/

About The Linux Foundation

Founded in 2000, The Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

###

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page:  https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

Media Contact

Jennifer Cloer
for the Linux Foundation
503-867-2304
jennifer@storychangesculture.com


The Zephyr Project Celebrates 5th Anniversary with new members and inaugural Zephyr Developer Summit on June 8-10

Thu, 06/03/2021 - 23:00

AVSystem, Golioth, Pat-Eta Electronics, RISC-V International and RISE Research Institutes of Sweden join Zephyr’s global open source RTOS ecosystem

SAN FRANCISCO, June 3, 2021 – The Zephyr Project, an open source project at the Linux Foundation that builds a safe, secure and flexible real-time operating system (RTOS) for resource-constrained devices, continues to gain momentum with its 5th anniversary this year. To celebrate the milestone, the Zephyr Project is hosting its inaugural Zephyr Developer Summit on June 8-10. The virtual event, which is free to attend, features several Zephyr leaders presenting real-world use cases, best practices, tutorials and more.

Happy 5th Anniversary

Launched in 2016 by the Linux Foundation, the Zephyr Project has continued to grow its technical community each year. Today, almost 1,000 contributors have helped the project surpass 50,000 commits building advanced support for multiple architectures such as ARC, Arm, Intel, Nios, RISC-V, SPARC and Tensilica and more than 250 boards.

The first-ever Zephyr Developer Summit will offer community members a chance to learn more about the fastest growing RTOS in an informal educational environment.

“We are kicking off our first Developer Summit with an impressive line-up of Zephyr thought leaders and ambassadors for the growing Zephyr community of contributors and users,” said Joel Stapleton, Chair of the Zephyr Project Governing Board and Principal Engineer Manager at Nordic Semiconductor. “The strength of engagement the project has with its members and IoT solution providers reflects the importance of open source efforts to build secure and safe embedded technologies for increasingly connected applications in industrial, smart home, wearables and energy; and for computing platforms integrating microcontrollers with ever-increasing capabilities and functions.”

Sample summit sessions include power management, USB support, and motor control; user presentations that showcase Zephyr with Renode, TensorFlow Lite, and RISC-V; and contributor spotlights on securing MCUBoot, using OPC UA, energy-efficient device testing, and developing hardware. Proposals were reviewed by the Programming Committee, which includes Anas Nashif, Intel; Carles Cufi, Nordic Semiconductor; Jonathan Beri, Golioth; Keith Short, Google; Maureen Helm, NXP; and Olof Johansson, Facebook. To see the complete schedule, click here. The registration deadline is June 4; click here to register.

The U.S. Executive Order on Cybersecurity

Less than a month ago, the United States White House released an Executive Order on Improving the Nation’s Cybersecurity that addressed the malicious cyber attacks that have become more frequent in the last few years. In a blog, the Linux Foundation explained how Zephyr RTOS, along with several other projects, has already built some of the support needed for a more secure future. Zephyr is able to generate Software Bills of Materials (SBOMs) automatically during the build, and this capability will be available in the upcoming 2.6 release. It is one of the few open source projects that is a CVE Numbering Authority (CNA) and has an active Project Security Incident Response Team (PSIRT) that manages responsible disclosure of vulnerabilities to product makers. 

Product creators using Zephyr can sign up for free to be notified of vulnerabilities.

“SBOMs can communicate details about a software package’s contents; being able to understand exactly which source files are included in a resource-constrained software image is key to understanding if it may be vulnerable to an exploit,” said Kate Stewart, Vice President of Dependable Embedded Systems at the Linux Foundation. “SBOMs created by manual processes can often be incomplete, incorrect or out-of-date as a software package advances. By being able to generate the SBOM during the build, and take it to the source file level, not just the component level, better diagnosis and detection of vulnerable states is possible and addresses some of the best practices mentioned in the EO. Zephyr is being used today in thousands of wearables and other products with constrained environments. By automatically creating SBOMs during builds, the development process becomes easier, more efficient and improves maintainability in the field.”

Zephyr’s Growing Ecosystem

Today, the Zephyr Project also welcomes AVSystem, Golioth, Pat-Eta Electronics, RISC-V and RISE Research Institutes of Sweden to its global RTOS ecosystem. These new members join Adafruit, Antmicro, BayLibre, Eclipse Foundation, Facebook, Fiware, Foundries.io, Google, Intel, Laird Connectivity, Linaro, Memfault, Nordic Semiconductor, NXP, Oticon, Parasoft, SiFive, Synopsys and teenage engineering, among others.

“We see amazing opportunities for IoT deployments involving resource-constrained devices operating in cellular LPWA networks,” said Marcin Nagy, Product Director for IoT, AVSystem. “We are sure that combining the Zephyr RTOS with our expertise in the Lightweight M2M standard will contribute to the acceleration of secure and standards-based IoT launches.”

“We can speak at length about the technical merits of Zephyr – the kernel design, native networking, scalable board support model and so on – but the largest differentiator is the community,” says Jonathan Beri, CEO of Golioth. “From chipset vendors to ecosystem players, it feels like we’re rising the tide for everyone to make the most secure & reliable open source RTOS in the market and we couldn’t be more excited to contribute to the project and community.”

“We are happy to be part of the Zephyr Project and hope to bring it more into the academic environment, especially within STEM (Science, Technology, Engineering and Mathematics),” said Sanyaade Adekoya, Developer, Programmer and Lecturer at Pat-Eta Electronics. “It has been challenging to bring RTOSes within the academic research sector and getting them in the hands of undergraduate learners. Our research extends the use of Zephyr RTOS in IoT, Edge Computing, Robotics, Smart and Wearable devices. The Zephyr Project will be a driving platform for our students that will make it easier for them to create ideas, projects, innovations and more. We look forward to showcasing our students’ Zephyr-related projects.”

“RISC-V and Zephyr were both designed to drive innovation in the hardware space with open source technologies that are accessible to everyone,” said Mark Himelstein, CTO of RISC-V. “Many of our members are already taking advantage of the flexibility of RISC-V and Zephyr to design end-to-end open source solutions for resource-constrained devices. We look forward to collaborating with the Zephyr Project to offer even more opportunities for the open source community to innovate.”

“Zephyr RTOS enables us to rapidly prototype Thread wireless networks and is an excellent research platform for our work in IoT security,” said Samuel Lindemer, Research Engineer at RISE Research Institutes of Sweden. “The interactive shell and configuration menu make it intuitive for new users, and the open-source community support is unparalleled.”

To learn more about Zephyr RTOS, visit the Zephyr website and blog.

About the Zephyr Project

The Zephyr Project is a small, scalable real-time operating system for use on resource-constrained systems supporting multiple architectures. To learn more, please visit www.zephyrproject.org.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

###


Interview with Daniel Scales, Chief Brand Counsel, Linux Foundation

Thu, 06/03/2021 - 21:00

Jason Perlow, Director of Project Insights and Editorial Content at the Linux Foundation, spoke with Daniel Scales about the importance of protecting trademarks in open source projects.

JP: It’s great to have you here today, and also, welcome to the Linux Foundation. First, can you tell me a bit about yourself, where you are from, and your interests outside work?

DS: Thanks, Jason! It is great to be here. I grew up in Upstate New York, lived in Washington and London for a few years after college, and have been in Boston for the last 20+ years. Outside of work, I coach my daughter’s soccer team, I like to cook and play my bass guitar, and I am really looking forward to getting back to some live music and sporting events. 

JP: And from what organization are you joining us?

DS: I have been with the Boston law firm Choate, Hall & Stewart since 2011. In addition to advising The Linux Foundation and other clients on trademark matters, I helped clients with open source license questions, technology licenses, and IP-focused transactions.  Before Choate, I worked as IP Counsel at Avid Technology, where I managed their trademark portfolio through a global rebranding and supported the engineering team on technology licenses. 

JP: So, how did you get into Intellectual Property law?

DS: Great question.  I studied economics in college and took a fantastic senior seminar on the economics of intellectual property.  After graduation, I worked in the economics consulting group at Ernst & Young.  A big part of my job there was determining the value of a company’s intangible property, which in many cases were its brands. I went to law school intending to study trademarks and the new field of “internet law” (this term probably dates me) and started my legal career at Testa, Hurwitz & Thibeault, which had a cutting-edge trademark and open source group.

JP: We typically think of IP and Trademark law as it applies to consumer products and commercial entities. What is the difference between those and when open source projects and organizations use brands?

DS:  On one level, there really isn’t a difference.  A trademark signifies the unique source of a good or service. Trademarks help consumers, developers, and users distinguish various offerings, and they communicate the specific source and quality of those offerings.  Software developers and users need to understand what code they have and where it came from. Trademarks help communicate that information.  Of course, the specific issues that every brand and brand owner faces and how they address them are different, but many of the core principles are the same.

JP: What are some of the trademark issues you’ve seen come up in open source communities?

DS: While it happens in every industry, I see many “helpful” people apply to register projects’ trademarks when they are not the rightful owner.  Sometimes they have good intentions, sometimes not, but it can be a lot of work to sort it out either way.  I’ve also had the opportunity to work with many different people and companies on project branding. It is amazing how many different philosophies there are regarding branding, even within the software industry.  Much of what we do is to bring these folks together to determine the best approach for the specific project.  I also spend a lot of time debating the scope of trademark rights with opposing counsel, but that isn’t really unique to open source:  one lawyer tried to convince me that his client had the exclusive right to use a picture of a hop flower on a beer label. 

Another common issue arises when a company registers a mark for its company or product and then uses the same mark for an open source project. The neutrality of those situations is imbalanced, and the Linux Foundation has worked with organizations making this transition. Sometimes it involves rebranding the open source project, and we assist in finding and clearing a new name for the community to use independent of the company that started it.

JP: Why is the Linux Foundation a good place for open source projects to protect their brands?

DS: We have worked with many open source projects on their trademarks, and we learn something with every new experience.  We can help them name the project at the beginning, take steps to protect their trademarks across the globe, and show them how trademarks can be a tool to build their communities and increase participation and adoption.  We also recognize the importance of our neutral position in the industries we serve and how that is fundamental to open governance.

Also Read: Open Source Communities and Trademarks: A Reprise

JP: Trademark conformance can also protect a project from technical drift. How can a trademark conformance program be used to encourage conformance with a project’s code base or interfaces? 

DS: Great point. As in most areas of trademarks, clarity and consistency are key. Trademarks used in a conformance program can be a great tool to communicate quickly and accurately to the target community.  Projects can develop specific and transparent criteria so that users understand exactly what the conformance trademark symbolizes.  This can be much more effective and efficient for projects and users alike than everyone deciding for themselves what a term like “compatible” might mean.  

Also Read: Driving Compatibility with Code and Specifications through Conformance Trademark Programs

JP: Do projects at the Linux Foundation give up all control of their trademark? How do you decide what enforcement to pursue or not pursue?

DS: On the contrary — we work very closely with project leadership throughout the lifecycle of their trademarks.  This includes trademark enforcement.  Typically, the first step is to figure out whether the situation requires enforcement (in the traditional legal sense) or if it is simply a matter of educating another party.  More often than not, we can reach out to the other party, discuss our project and our trademarks, discuss our concerns, and work out a solution that works for everyone and strengthens our brands.  But like any brand owner, we do sometimes have to take other action to protect our projects’ trademarks, and we work closely with our projects in those situations, too.

JP: Thanks, Daniel. It’s been great talking to you today!


Super Blueprints Integrate the 5G Open Source Stack from Core to Door

Tue, 06/01/2021 - 23:03

There is an exciting convergence in the networking industry around open source, and the energy is palpable. At LF Networking, we have a unique perspective as the largest open source initiative in the networking space with the broadest set of projects that make up the diverse and evolving open source networking stack. LF Networking provides platforms and building blocks across the networking industry that enable rapid interoperability, deployment, and adoption and is the nexus for 5G innovation and integration. 

LF Networking has now tapped the confluence of industry efforts to structure a new initiative to develop 5G Super Blueprints for the ecosystem. Major integrations between the building blocks are now underway, including ONAP and O-RAN, Akraino and Magma, and Anuket and Kubernetes. 

“Super” means that we’re integrating multiple projects, umbrellas (such as LF Edge, Magma, CNCF, O-RAN Alliance, LF Energy, and more) with an end-to-end framework for the underlying infrastructure and application layers across edge, access, and core. This end-to-end integration enables top industry use cases, such as fixed wireless, mobile broadband, private 5G, multi-access, IoT, voice services, network slicing, and more. In short, 5G Super Blueprints are a vehicle to collaborate and create end-to-end 5G solutions.

Major industry verticals banking on this convergence and roadmap include the global telcos that you’d expect, but 5G knows no boundaries, and we’re seeing deep engagement from cloud service providers, enterprise IT, governments, and even energy.

5G is poised to modernize today’s energy grid with awareness monitoring across Distribution Systems and more.

This will roll out in 3 phases, the first encompassing 5G Core + Multi-access Edge Computing (MEC) using emulators. The second phase introduces commercial RANs to end-to-end 5G, and the third phase will integrate Open Radio Access Network (O-RAN). 

The 5G Super Blueprint is an open initiative, and participation is open to anyone. To learn more, please see the 5G Super Blueprint FAQ and watch the video, What is the 5G Super Blueprint? from Next Gen Infra

Participation in this group has tripled over the last few weeks! If you’re ready to join us, please indicate your interest in participation on the 5G Super Blueprint webpage, and follow the onboarding steps on the 5G Super Blueprint Wiki. Send any questions to superblueprint@lfnetworking.org


The Linux Foundation joins Accenture, GitHub, Microsoft, and ThoughtWorks to Launch the Green Software Foundation to put sustainability at the core of software engineering

Wed, 05/26/2021 - 04:44

As we think about the future of the software industry, we believe we have a responsibility to help build a better future – a more sustainable future – both internally at our organizations and in partnership with industry leaders around the globe. With data centers around the world accounting for 1% of global electricity demand, and projected to consume 3-8% in the next decade, it’s imperative we address this as an industry.


To help in that endeavor, we’re excited to announce the formation of The Green Software Foundation – a nonprofit founded by Accenture, GitHub, Microsoft, and ThoughtWorks established with the Linux Foundation and the Joint Development Foundation Projects LLC to build a trusted ecosystem of people, standards, tooling, and leading practices for building green software.

Read more at The Microsoft Blog


SPDX: It’s Already in Use for Global Software Bill of Materials (SBOM) and Supply Chain Security

Wed, 05/26/2021 - 04:28

Author: Kate Stewart, VP of Dependable Systems, The Linux Foundation

In a previous Linux Foundation blog, David A. Wheeler, director of LF Supply Chain Security, discussed how capabilities built by Linux Foundation communities can be used to address the software supply chain security requirements set by the US Executive Order on Cybersecurity. 

One of those capabilities, SPDX, completely addresses the Executive Order 4(e) and 4(f) and 10(j) requirements for a Software Bill of Materials (SBOM). The SPDX specification is implemented as a file format that identifies the software components within a larger piece of computer software and metadata such as the licenses of those components. 

SPDX is an open standard for communicating software bill of material (SBOM) information, including components, licenses, copyrights, and security references. It has a rich ecosystem of existing tools that provides a common format for companies and communities to share important data to streamline and improve the identification and monitoring of software.
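To make the tag-value form of the format concrete, here is a short, illustrative Python sketch that emits a minimal SPDX 2.2 document describing a single package. The field names follow the published SPDX 2.2 specification; the package name, version, namespace URL, and tool name are invented for the example, and a real SBOM would typically carry far more detail (files, relationships, checksums, and security references).

```python
# Illustrative only: emit a minimal SPDX 2.2 tag-value document for one package.
from datetime import datetime, timezone

def minimal_spdx(package: str, version: str, license_id: str) -> str:
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return "\n".join([
        "SPDXVersion: SPDX-2.2",
        "DataLicense: CC0-1.0",
        "SPDXID: SPDXRef-DOCUMENT",
        f"DocumentName: {package}-{version}",
        # The namespace is a unique URI for this document; example.com is a placeholder.
        f"DocumentNamespace: https://example.com/spdxdocs/{package}-{version}",
        "Creator: Tool: example-sbom-generator",
        f"Created: {created}",
        "",
        f"PackageName: {package}",
        "SPDXID: SPDXRef-Package",
        f"PackageVersion: {version}",
        "PackageDownloadLocation: NOASSERTION",
        "FilesAnalyzed: false",
        f"PackageLicenseConcluded: {license_id}",
        f"PackageLicenseDeclared: {license_id}",
        "PackageCopyrightText: NOASSERTION",
    ])

print(minimal_spdx("demo-lib", "1.0.0", "Apache-2.0"))
```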

SBOMs have numerous use cases. They have frequently been used in areas such as license compliance but are equally useful in security, export control, and broader processes such as mergers and acquisitions (M&A) or venture capital investments. SPDX maintains an active community to support various uses, modeling its governance and activity on the same approach that has successfully supported open source software projects over the past three decades.

The LF has been developing and refining SPDX for over ten years and has seen extensive uptake by companies and projects in the software industry.  Notable recent examples are the contributions by companies such as Hitachi, Fujitsu, and Toshiba in furthering the standard via optional profiles like “SPDX Lite” in the SPDX 2.2 specification release and in support of the SPDX SBOMs in proprietary and open source automation solutions. 

This de facto standard has been submitted to ISO via the Joint Development Foundation using the PAS Transposition process of Joint Technical Committee 1 (JTC1). It is currently in the enquiry phase of the process and can be reviewed on the ISO website as ISO/IEC DIS 5962.

There is a wide range of open source tooling available today, as well as commercial tool options and more emerging. Companies such as FOSSID and Synopsys have been working with the SPDX format for several years. Open source tools like FOSSology (source code analysis), OSS Review Toolkit (generation from CI and build infrastructure), Tern (container content analysis), Quartermaster (build extensions), and ScanCode (source code analysis), in addition to the SPDX-tools project, have standardized on SPDX for interchange and participate in the Automated Compliance Tooling (ACT) project umbrella. ACT has been discussed as a community-driven solution for software supply chain security remediation as part of our synopsis of the findings in the Vulnerabilities in the Core study, which was published by the Linux Foundation and Harvard University LISH in February of 2020.

One thing is clear: A software bill of materials that can be shared without friction between different teams and companies will be a core part of software development and deployment in this coming decade. The sharing of software metadata will take different forms, including manual and automated reviews, but the core structures will remain the same. 

Standardization in this field, as in others, is the key to success. This domain has an advantage in that we are benefiting from an entire decade of prior work in SPDX. Therefore the process becomes the implementation of this standard to the various domains rather than the creation, expansion, or additional refinement of new or budding approaches to the matter.

Start using the SPDX specification here: https://spdx.github.io/spdx-spec/. Development of the next revision is underway, so if there’s a use case you can’t represent with the current specification, open an issue; this is the right window for input.

To learn more about the many facets of the SPDX project see: https://spdx.dev/


How LF communities enable security measures required by the US Executive Order on Cybersecurity

Fri, 05/14/2021 - 22:30

Our communities take security seriously and have been instrumental in creating the tools and standards that every organization needs to comply with the recent US Executive Order

Overview

The US White House recently released its Executive Order (EO) on Improving the Nation’s Cybersecurity (along with a press call) to counter “persistent and increasingly sophisticated malicious cyber campaigns that threaten the public sector, the private sector, and ultimately the American people’s security and privacy.”

In this post, we’ll show what the Linux Foundation’s communities have already built that support this EO and note some other ways to assist in the future. But first, let’s put things in context.

The Linux Foundation’s Open Source Security Initiatives In Context

We deeply care about security, including supply chain (SC) security. The Linux Foundation is home to some of the most important and widely-used OSS, including the Linux kernel and Kubernetes. The LF’s previous Core Infrastructure Initiative (CII) and its current Open Source Security Foundation (OpenSSF) have been working to secure OSS, both in general and in widely-used components. The OpenSSF, in particular, is a broad industry coalition “collaborating to secure the open source ecosystem.”

The Software Package Data Exchange (SPDX) project has been working for the last ten years to enable software transparency and the exchange of software bill of materials (SBOM) data necessary for security analysis. SPDX is in the final stages of review to be an ISO standard, is supported by global companies with massive supply chains, and has a large open and closed source tooling support ecosystem. SPDX already meets the requirements of the executive order for SBOMs.

Finally, several LF foundations have focused on the security of various verticals. For example,  LF Public Health and LF Energy have worked on security in their respective sectors.

Given that context, let’s look at some of the EO statements (in the order they are written) and how our communities have invested years in open collaboration to address these challenges.

Best Practices

The EO 4(b) and 4(c) say that

The “Secretary of Commerce [acting through NIST] shall solicit input from the Federal Government, private sector, academia, and other appropriate actors to identify existing or develop new standards, tools, and best practices for complying with the standards, procedures, or criteria [including] criteria that can be used to evaluate software security, include criteria to evaluate the security practices of the developers and suppliers themselves, and identify innovative tools or methods to demonstrate conformance with secure practices [and guidelines] for enhancing software supply chain security.” Later in EO 4(e)(ix) it discusses “attesting to conformity with secure software development practices.”

The OpenSSF’s CII Best Practices badge project specifically identifies best practices for OSS, focusing on security and including criteria to evaluate the security practices of developers and suppliers (it has over 3,800 participating projects). LF is also working with SLSA (currently in development) as potential additional guidance focused on addressing supply chain issues further.

Best practices are only useful if developers understand them, yet most software developers have never received education or training in developing secure software. The LF has developed and released its Secure Software Development Fundamentals set of courses available on edX to anyone at no cost. The OpenSSF Best Practices Working Group (WG) actively works to identify and promulgate best practices. We also provide a number of specific standards, tools, and best practices, as discussed below.

Encryption and Data Confidentiality

The EO 3(d) requires agencies to adopt “encryption for data at rest and in transit.” Encryption in transit is implemented on the web using the TLS (“https://”) protocol, and Let’s Encrypt is the world’s largest certificate authority for TLS certificates.

In addition, the LF Confidential Computing Consortium is dedicated to defining and accelerating the adoption of confidential computing. Confidential computing protects data in use (not just at rest and in transit) by performing computation in a hardware-based Trusted Execution Environment. These secure and isolated environments prevent unauthorized access or modification of applications and data while in use.

Supply Chain Integrity

The EO 4(e)(iii) states a requirement for

 “employing automated tools, or comparable processes, to maintain trusted source code supply chains, thereby ensuring the integrity of the code.” 

The LF has many projects that support SC integrity, in particular:

  • in-toto is a framework specifically designed to secure the integrity of software supply chains.
  • The Update Framework (TUF) helps developers maintain the security of software update systems, and is used in production by various tech companies and open source organizations.  
  • Uptane is a variant of TUF; it’s an open and secure software update system design which protects software delivered over-the-air to the computerized units of automobiles.
  • sigstore is a project to provide a public good / non-profit service to improve the open source software supply chain by easing the adoption of cryptographic software signing (of artifacts such as release files and container images) backed by transparency log technologies (which provide a tamper-resistant public log). 
  • OpenChain (ISO 5230) is the International Standard for open source license compliance. Application of OpenChain requires identification of OSS components. While OpenChain by itself focuses more on licenses, that identification is easily reused to analyze other aspects of those components once they’re identified (for example, to look for known vulnerabilities).

Software Bill of Materials (SBOMs) support supply chain integrity; our SBOM work is so extensive that we’ll discuss that separately.

Software Bill of Materials (SBOMs)

Many cyber risks come from using components with known vulnerabilities. Known vulnerabilities are especially concerning in key infrastructure industries, such as the national fuel pipelines,  telecommunications networks, utilities, and energy grids. The exploitation of those vulnerabilities could lead to interruption of supply lines and service, and in some cases, loss of life due to a cyberattack.

One-time reviews don’t help since these vulnerabilities are typically found after the component has been developed and incorporated. Instead, what is needed is visibility into the components of the software environments that run these key infrastructure systems, similar to how food ingredients are made visible.

A Software Bill of Materials (SBOM) is a nested inventory or a list of ingredients that make up the software components used in creating a device or system. This is especially critical as it relates to a national digital infrastructure used within government agencies and in key industries that present national security risks if penetrated. Use of SBOMs would improve understanding of the operational and cyber risks of those software components from their originating supply chain.

The EO has extensive text about requiring a software bill of materials (SBOM) and tasks that depend on SBOMs:

  • EO 4(e) requires providing a purchaser an SBOM “for each product directly or by publishing it on a public website” and “ensuring and attesting… the integrity and provenance of open source software used within any portion of a product.” 
  • It also requires tasks that typically require SBOMs, e.g., “employing automated tools, or comparable processes, that check for known and potential vulnerabilities and remediate them, which shall operate regularly….” and “maintaining accurate and up-to-date data, provenance (i.e., origin) of software code or components, and controls on internal and third-party software components, tools, and services present in software development processes, and performing audits and enforcement of these controls on a recurring basis.” 
  • EO 4(f) requires publishing “minimum elements for an SBOM,” and EO 10(j) formally defines an SBOM as a “formal record containing the details and supply chain relationships of various components used in building software…  The SBOM enumerates [assembled] components in a product… analogous to a list of ingredients on food packaging.”

The LF has been developing and refining SPDX for over ten years; SPDX is used worldwide and is in the process of being approved as ISO/IEC Draft International Standard (DIS) 5962.  SPDX is a file format that identifies the software components within a larger piece of computer software and metadata such as the licenses of those components. SPDX 2.2 already supports the current guidance from the National Telecommunications and Information Administration (NTIA) for minimum SBOM elements. Some ecosystems have ecosystem-specific conventions for SBOM information, but SPDX can provide information across all arbitrary ecosystems.

SPDX is real and in use today, with increased adoption expected in the future. For example:

  • An NTIA “plugfest” demonstrated ten different producers generating SPDX. SPDX supports acquiring data from different sources (e.g., source code analysis, executables from producers, and analysis from third parties). 
  • A corpus of some LF projects with SPDX source SBOMs is available. 
  • Various LF projects are working to generate binary SBOMs as part of their builds, including Yocto and Zephyr. 
  • To assist with further SPDX adoption, the LF is paying to write SPDX plugins for major package managers.

Vulnerability Disclosure

No matter what, some vulnerabilities will be found later and need to be fixed. EO 4(e)(viii) requires “participating in a vulnerability disclosure program that includes a reporting and disclosure process.” That way, vulnerabilities that are found can be reported to the organizations that can fix them. 

The CII Best Practices badge passing criteria requires that OSS projects specifically identify how to report vulnerabilities to them. More broadly, the OpenSSF Vulnerability Disclosures Working Group is working to help “mature and advocate well-managed vulnerability reporting and communication” for OSS. Most widely-used Linux distributions have a robust security response team, but the Alpine Linux distribution (widely used in container-based systems) did not. The Linux Foundation and Google funded various improvements to Alpine Linux, including a security response team.

We hope that the US will update its Vulnerabilities Equities Process (VEP) to work more cooperatively with commercial organizations, including OSS projects, to share more vulnerability information. Every vulnerability that the US fails to disclose is a vulnerability that can be found and exploited by attackers. We would welcome such discussions.

Critical Software

It’s especially important to focus on critical software — but what is critical software? EO 4(g) requires the executive branch to define “critical software,” and 4(h) requires the executive branch to “identify and make available to agencies a list of categories of software and software products… meeting the definition of critical software.”

Linux Foundation and the Laboratory for Innovation Science at Harvard (LISH) developed the report ‘Vulnerabilities in the Core,’ a Preliminary Report and Census II of Open Source Software, which analyzed the use of OSS to help identify critical software. The LF and LISH are in the process of updating that report. The CII identified many important projects and assisted them, including OpenSSL (after Heartbleed), OpenSSH, GnuPG, Frama-C, and the OWASP Zed Attack Proxy (ZAP). The OpenSSF Securing Critical Projects Working Group has been working to better identify critical OSS projects and to focus resources on critical OSS projects that need help. There is already a first-cut list of such projects, along with efforts to fund such aid.

Internet of Things (IoT)

Unfortunately, internet-of-things (IoT) devices often have notoriously bad security. It’s often been said that “the S in IoT stands for security.” 

EO 4(s) initiates a pilot program to “educate the public on the security capabilities of Internet-of-Things (IoT) devices and software development practices [based on existing consumer product labeling programs], and shall consider ways to incentivize manufacturers and developers to participate in these programs.” EO 4(t) states that such “IoT cybersecurity criteria” shall “reflect increasingly comprehensive levels of testing and assessment.”

The Linux Foundation develops and is home to many of the key components of IoT systems. These include:

  • The Linux kernel, used by many IoT devices. 
  • The yocto project, which creates custom Linux-based systems for IoT and embedded systems. Yocto supports full reproducible builds. 
  • EdgeX Foundry, which is a flexible OSS framework that facilitates interoperability between devices and applications at the IoT edge, and has been downloaded millions of times. 
  • The Zephyr project, which provides a real-time operating system (RTOS) used by many for resource-constrained IoT devices and is able to generate SBOMs automatically during the build. Zephyr is one of the few open source projects that is a CVE Numbering Authority.
  • The seL4 microkernel, which is the most assured operating system kernel in the world; it’s notable for its comprehensive formal verification.

Security Labeling

EO 4(u) focuses on identifying:

“secure software development practices or criteria for a consumer software labeling program [that reflects] a baseline level of secure practices, and if practicable, shall reflect increasingly comprehensive levels of testing and assessment that a product may have undergone [and] identify, modify, or develop a recommended label or, if practicable, a tiered software security rating system.”

The OpenSSF’s CII Best Practices badge project (noted earlier) specifically identifies best practices for OSS development, and is already tiered (passing, silver, and gold). Over 3,800 projects currently participate.
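
As a trivial sketch of what a tiered rating looks like in code (the three levels mirror the badge project’s passing, silver, and gold tiers, but the numeric thresholds are assumptions for illustration rather than the badge application’s actual data model), such a scheme reduces to a lookup:

```python
# Sketch of a tiered rating lookup in the spirit of passing/silver/gold badge
# levels. The "one hundred points per tier" thresholds are an assumption made
# for this example, not the CII Best Practices BadgeApp's real API.
TIERS = [(300, "gold"), (200, "silver"), (100, "passing")]


def tier(score: int) -> str:
    """Map a cumulative score to the highest tier whose threshold it meets."""
    for threshold, label in TIERS:
        if score >= threshold:
            return label
    return "in progress"


if __name__ == "__main__":
    for project, score in {"example-lib": 115, "example-daemon": 300}.items():
        print(f"{project} -> {tier(score)}")
```

The value of a tiered label is that it gives consumers a coarse but comparable signal, which is what the EO’s “tiered software security rating system” language describes.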

There are also a number of projects that relate to measuring security and/or broader quality:

Conclusion

The Linux Foundation (LF) has long been working to help improve the security of open source software (OSS), which powers systems worldwide. We couldn’t do this without the many contributions of time, money, and other resources from numerous companies and individuals; we gratefully thank them all.  We are always delighted to work with anyone to improve the development and deployment of open source software, which is important to us.

David A. Wheeler, Director of Open Source Supply Chain Security at the Linux Foundation

The post How LF communities enable security measures required by the US Executive Order on Cybersecurity appeared first on Linux Foundation.

Hyperledger Announces 2021 Brand Study

Wed, 05/12/2021 - 21:48

The debate is no longer about deploying blockchain technology, but rather about building production networks that will scale and interoperate. In 2020, the focus shifted from proving the value of blockchain to scaling, governance, and managing blockchain networks. COVID-19 has given the digitization of trust-based processes a new urgency, driving more profound interest in identity, interoperability, and supply chain use cases. 

Together with Linux Foundation Research, Hyperledger is conducting a survey to measure market awareness and perceptions of Hyperledger and its projects, specifically identifying myths and misperceptions. Additionally, the survey seeks to help Hyperledger articulate the perceived time to production readiness for products and to understand the motivations of developers who both use and contribute to Hyperledger technologies.

Hyperledger is an open source collaborative effort created to advance cross-industry blockchain technologies. It is a global collaboration including participation from leaders in finance, banking, healthcare, supply chains, manufacturing, and technology. 

  • Please participate now; we intend to close the survey in early June. 
  • Privacy and confidentiality are important to us. Neither participant names, nor their company names, will be displayed in the final results. 
  • This survey should take no more than 20 minutes of your time.

To take the 2021 Hyperledger Market Survey, click the button below:

Take Survey (English) | Take Survey (Japanese) | Take Survey (Chinese)

Thanks to our survey partner Linux Foundation Japan.

SURVEY GOALS

Thank you for taking the time to participate in this survey conducted by Hyperledger, an open source project at the Linux Foundation focused on developing a suite of stable frameworks, tools, and libraries for enterprise-grade blockchain deployments.

Hyperledger and its affiliated projects are hosted by the Linux Foundation.

This survey will provide insights into the challenges, familiarity, and misconceptions about Hyperledger and its suite of technologies. We hope these insights will help guide us in the growth and expansion of marketing and recruitment efforts to help grow projects and our community.

This survey will provide insights into:

  • What is the awareness, familiarity, and understanding of Hyperledger overall and by project?
  • What are the myths and misperceptions about Hyperledger (e.g., what it seeks to achieve, the number of projects, who is involved, and who the competitors are)?
  • How likely are respondents to purchase or adopt blockchain technology?
  • What is the appeal of joining the Hyperledger community?
  • What are the perceptions of business blockchain technology?
  • What is the perceived time to production readiness?
  • What are developers’ motivations for contributing to and using Hyperledger?

PRIVACY

Your name and company name will not be displayed. Responses are attributed only to your role, company size, and industry. Responses will be subject to the Linux Foundation’s Privacy Policy, available at https://linuxfoundation.org/privacy. Please note that members of the Hyperledger survey committee who are not LF employees will review the survey results.

VISIBILITY

We will summarize the survey data and share the findings during the Hyperledger Member Summit later in the year. The summary report will be published on the Hyperledger and Linux Foundation websites. In addition, we will be producing an in-depth report of the survey which will be shared with Hyperledger membership.

QUESTIONS

If you have questions regarding this survey, please email us at survey@hyperledger.org

Sign up for the Hyperledger Newsletter at https://hyperledger.org 

The post Hyperledger Announces 2021 Brand Study appeared first on Linux Foundation.

Open Source API Gateway KrakenD Becomes Linux Foundation Project

Tue, 05/11/2021 - 23:00

The KrakenD framework becomes the Lura Project and finds a new home at the Linux Foundation, where it will be the only enterprise-grade API Gateway hosted in a neutral, open forum

SAN FRANCISCO, May 11, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced it is hosting the Lura Project, formerly the KrakenD open source project. Lura is a framework for building Application Programming Interface (API) Gateways that goes beyond a simple reverse proxy, functioning as an aggregator for many microservices and as a declarative tool for creating endpoints.
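
Lura itself is a Go framework driven by declarative configuration; the following is only a conceptual sketch, written in Python with hypothetical backend URLs, of what aggregating several microservices behind a single gateway endpoint means. It is not Lura’s API or configuration format:

```python
# Conceptual sketch of API-gateway aggregation (not Lura's actual API or
# configuration): one gateway endpoint fans out to several backend services
# in parallel and merges their JSON responses into a single payload.
import concurrent.futures
import json
import urllib.request

# Hypothetical backend services; any real deployment would use its own URLs.
BACKENDS = {
    "user":   "http://localhost:8001/user/42",
    "orders": "http://localhost:8002/orders?user=42",
}


def fetch(item):
    name, url = item
    with urllib.request.urlopen(url, timeout=2) as resp:
        return name, json.load(resp)


def aggregated_endpoint() -> dict:
    """Call every backend concurrently and nest each reply under its own key."""
    merged = {}
    with concurrent.futures.ThreadPoolExecutor() as pool:
        for name, payload in pool.map(fetch, BACKENDS.items()):
            merged[name] = payload
    return merged


if __name__ == "__main__":
    print(json.dumps(aggregated_endpoint(), indent=2))
```

In Lura, the same fan-out-and-merge behavior is declared in configuration rather than hand-written, which is what makes it practical at the scale described below.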

Partners include 99P Labs (backed by Ohio State University), Ardan Studios, Hepsiburada, Openroom, Postman, Skalena and Stayforlong. 

“By being hosted at the Linux Foundation, the Lura Project will extend the legacy of the KrakenD open source framework and be better poised to support its massive adoption among more than one million servers every month,” said Albert Lombarte, CEO, KrakenD. “The Foundation’s open governance model will accelerate development and community support for this amazing success.”

API Gateways have become even more valuable as the necessary fabric for connecting cloud applications and services in hybrid environments. KrakenD was created five years ago as a library for engineers to create fast and reliable API Gateways. It has been in production among some of the world’s largest Internet businesses since 2016. As the Lura Project, it is a stateless, distributed, high-performance API Gateway that enables microservices adoption.

“The Lura Project is essential connective tissue for applications and services across open source cloud projects, so it’s a natural decision to host it at the Linux Foundation,” said Mike Dolan, senior vice president and general manager of Projects at the Linux Foundation. “We’re looking forward to providing the open governance structure to support Lura Project’s massive growth.”

For more information about the Lura Project, please visit: https://www.luraproject.org

Supporting Comments

Ardan Studios

“I’m excited to hear that KrakenD API Gateway is being brought into the family of open source projects managed by the Linux Foundation. I believe this shows the global community the commitment KrakenD has to keeping their technology open source and free to use. With the adoption that already exists, and this new promise towards the future, I expect amazing things for the product and the community around it,” said William Kennedy, Managing Partner at Ardan Studios.

Hepsiburada

“At Hepsiburada we have a massive amount of traffic and a complex ecosystem of around 500 microservices and different datacenters. Adding KrakenD to our Kubernetes clusters has helped us reduce the technical and organizational challenges of dealing with a vast amount of resources securely and easily. We have over 800 containers running with KrakenD and looking forward to having more,” said Alper Hankendi, Engineering Director Hepsiburada.

Openroom

“KrakenD allowed us to focus on our backend and deploy a secure and performant system in a few days. After more than 2 years of use in production and 0 crash or malfunction, it also has proven its robustness,” said Jonathan Muller, CTO Openroom Inc.

Postman

“KrakenD represents a renaissance of innovation and investment in the API gateway and management space by challenging the established players with a more lightweight, high performance, and modern gateway for API publishers to put to work across their API operations, while also continuing to establish the Linux Foundation as the home for open API specifications and tooling that are continuing to touch and shape almost every business sector today,” said Kin Lane, chief evangelist, Postman.

Stayforlong

“KrakenD makes it easier for us to manage authentication, filter bots, and integrate our apps. It has proved to be stable and reliable since day one. It is wonderful!” said Raúl M. Sillero, CTO Stayforlong.com.

Skalena

“The Opensource model always was a great proof of innovation and nowadays a synonym of high-quality products and incredible attention with the real needs from the market (Customer Experience). The Linux Foundation is one of the catalysts of incredible solutions, and KrakenD and now Lura would not have a better place to be. With this move, I am sure that it is a start of a new era for this incredible solution in the API Gateway space,  the market will be astonished by a lot of good things about to come,” said Edgar Silva, founder and partner at Skalena. 

About The Linux Foundation

Founded in 2000, The Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

###

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page:  https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

Media Contact

Jennifer Cloer
for the Linux Foundation
503-867-2304
jennifer@storychangesculture.com

The post Open Source API Gateway KrakenD Becomes Linux Foundation Project appeared first on Linux Foundation.

The Linux Foundation and NGMN Collaborate on End-to-End 5G and Beyond

Tue, 05/11/2021 - 00:00

SAN FRANCISCO, Calif. and FRANKFURT, GERMANY – May 10, 2021 – The Linux Foundation and the Next Generation Mobile Networks Alliance (NGMN) today announce the signing of a Memorandum of Understanding (MoU) for formal collaboration regarding end-to-end 5G and beyond.

NGMN’s mission is to provide impactful industry guidance to achieve innovative and affordable mobile telecommunication services for the end user, placing a particular focus on Mastering the Route to Disaggregation, Sustainability and Green Future Networks, as well as on 6G and the continuous support of 5G’s full implementation.

Creating and providing open, scalable building blocks for operators and service providers is critical to the industry adoption of 5G and beyond. Therefore, the collaboration between NGMN and the Linux Foundation will focus on end-to-end 5G architecture and beyond 5G. Specific areas of alignment may include sustainability, network automation and network autonomy based on Artificial Intelligence, security, edge cloud, virtualization, disaggregation, cloud native, and service-based architecture, to name a few. 

“We very much look forward to a mutually inspiring and beneficial collaboration with The Linux Foundation. Open Source is gaining increasing relevance for the strategic topics of our Work Programmes such as Mastering the Route to Disaggregation, Green Future Networks and 6G. We are delighted to partner with The Linux Foundation to jointly drive our mission for the benefit of the global ecosystem”, said Anita Doehler, CEO, NGMN Alliance.

“We are thrilled to be aligning with such an innovative, industry-leading organization,” said Arpit Joshipura, General Manager, Networking, Edge and IoT, the Linux Foundation. “Integrating NGMN’s expertise across pivotal areas like Disaggregation, Green Future Networks, cloud native, automation, and early work on 6G into LF Networking’s 5G Super Blueprint initiative is a natural next step for the industry.”

The Linux Foundation’s vision of harmonizing open source software with open standards has been in effect for several years, including collaborations with ETSI, TMF, MEF, GSMA, the O-RAN Alliance, and more. NGMN also maintains longstanding co-operations with all of these organisations. The alignment between The Linux Foundation and NGMN represents the latest in a long-standing effort to integrate open source and open standards across the industry. 

About NGMN Alliance (www.ngmn.org)

The NGMN Alliance (Next Generation Mobile Networks Alliance) is a forum founded by world-leading Mobile Network Operators and open to all partners in the mobile industry. Its goal is to ensure that next generation network infrastructure, service platforms and devices will meet the requirements of operators and, ultimately, will satisfy end user demand and expectations. The vision of the NGMN Alliance is to provide impactful industry guidance to achieve innovative and affordable mobile telecommunication services for the end user with a particular focus on supporting 5G’s full implementation, Mastering the Route to Disaggregation, Sustainability and Green Networks, and work on 6G.

NGMN seeks to incorporate the views of all interested stakeholders in the telecommunications industry and is open to three categories of participants (NGMN Partners): Mobile Network Operators (Members), vendors, software companies and other industry players (Contributors), as well as research institutes (Advisors).

About the Linux Foundation

The Linux Foundation is the organization of choice for the world’s top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at www.linuxfoundation.org.

# # #

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

The post The Linux Foundation and NGMN Collaborate on End-to-End 5G and Beyond appeared first on Linux Foundation.

Interview with Masato Endo, OpenChain Project Japan

Mon, 05/10/2021 - 23:37

Linux Foundation Editorial Director Jason Perlow had a chance to speak with Masato Endo, OpenChain Project Automotive Chair and Leader of the OpenChain Project Japan Work Group Promotion Sub Group, about the Japan Ministry of Economy, Trade and Industry’s (METI) recent study on open source software management.

JP: Greetings, Endo-san! It is my pleasure to speak with you today. Can you tell me a bit about yourself and how you got involved with the Japan Ministry of Economy, Trade, and Industry?

遠藤さん、こんにちは!本日はお話しできることをうれしく思います。あなた自身について、また経済産業省とどのように関わっていますか。

ME: Hi, Jason-san! Thank you for such a precious opportunity. I’m a manager and scrum master in the planning and development department of new services at a Japanese automotive company. We were also working on building the OSS governance structure of the company, including obtaining OpenChain certification.

As an open source community member, I participated in the OpenChain project and was involved in establishing the OpenChain Japan Working Group and Automotive Working Group. Recently, as a leader of the Promotion SG of the Japan Working Group, I am focusing on promoting OSS license compliance in Japan.

In this project, I contribute as a bridge between the Ministry of Economy, Trade and Industry and the members of OSS community projects such as OpenChain.

For example, I recently gave a presentation on OpenChain at a task force meeting and introduced companies that cooperated with the case study.

Jasonさん、こんにちは。このような貴重な機会をありがとうございます。

私は、自動車メーカーの新サービスの企画・開発部署でマネージャーやスクラムマスターを務めています。また、OpenChain認証取得等の会社のオープンソースガバナンス体制構築についても取り組んでいました。

一方、コミュニティメンバーとしてもOpenChainプロジェクトに参加し、OpenChain Japan WGやAutomotive WGの設立に関わりました。最近では、Japan WGのPromotion SGのリーダーとして日本におけるOSSライセンスコンプライアンスの啓発活動に注力しています。

今回のプロジェクトにおいては、経済産業省のタスクフォースとOpenChainとの懸け橋として、ミーティングにてOpenChainの活動を紹介させて頂いたり、ケーススタディへの協力企業を紹介させて頂いたりすることで、コントリビューションさせて頂きました。

JP: What does the Ministry of Economy, Trade, and Industry (METI) do?

経済産業省(METI)はどのような役割の役所ですか?

ME: METI has jurisdiction over the administration of the Japanese economy and industry. This case study was conducted by a task force, under the Commerce and Information Policy Bureau’s Cybersecurity Division, that examines software management methods for ensuring cyber-physical security.

経済産業省は経済や産業に関する行政を所管しています。今回のケーススタディは商務情報政策局サイバーセキュリティ課によるサイバー・フィジカル・セキュリティ確保に向けたソフトウェア管理手法等検討タスクフォースにより実施されたものです。

JP: Why did METI commission a study on the management of open source program offices and open source software management at Japanese companies?

なぜ経済産業省は、日本企業のオープンソースプログラムオフィスの管理とオープンソースソフトウェアの管理に関する調査を実施したのですか?

ME: METI itself conducted this survey. The Task Force has been considering appropriate software management methods, vulnerability countermeasures, license countermeasures, and so on.

Meanwhile, as the importance of OSS utilization has increased in recent years, it concluded that sharing the knowledge of each company regarding OSS management methods helps solve each company’s problems.

今回の調査は、METIが主体的に行ったものです。タスクフォースは適切なソフトウェアの管理手法、脆弱性対応やライセンス対応などについて検討してきました。

そんな中、昨今のOSS利活用の重要性が高まる中、OSSの管理手法に関する各企業の知見の共有が各社の課題解決に有効だという結論に至りました。

JP: How do Japanese corporations differ from western counterparts in open source culture? 

日本の企業は、オープンソース文化において欧米の企業とどのように違いますか?

ME: Like Western companies, Japanese companies also use OSS in various technical fields, and OSS has become indispensable. In addition, more than 80 companies have participated in the Japan Working Group of the OpenChain project. As a result, the momentum to promote the utilization of OSS is increasing in Japan.

On the other hand, some survey results show that Japanese companies’ contribution processes and support systems lag behind those of Western companies, so it is necessary to further promote community activities in Japan.

欧米の企業と同様、日本の企業でもOSSは様々な技術領域で使われており、欠かせないものになっています。また、OpenChainプロジェクトのJPWGに80社以上の企業が参加するなど、企業としてOSSの利活用を推進する機運も高まってきています。

一方で、欧米企業と比較するとコントリビューションのプロセスやサポート体制の整備が遅れているという調査結果も出ているため、コミュニティ活動を促進する仕組みをより強化していく必要があると考えられます。

JP: What are the challenges that the open source community and METI have identified due to the study that Japanese companies face when adopting open source software within their organizations? 

日本企業が組織内でオープンソースソフトウェアを採用する際に直面する調査の結果、オープンソースコミュニティと経済産業省が特定した課題は何ですか?

ME: In this case study, many companies mentioned license compliance. It was found that each company has established a company-wide system and rules to comply with licenses and provides education to engineers. The best approach depends on the industry and the size of the company, but I believe the information from this case study is very useful for companies all over the world.

In addition, it was confirmed that the Software Bill of Materials (SBOM) is becoming more critical for companies from the viewpoint of both vulnerability response and license compliance. Regardless of whether companies are using OSS internally or exchanging software with an external partner, it is important to clarify which OSS they are using. I recognize that this issue is a hot topic in Western companies as well, under the label “software transparency.”

In this case study, several companies also mentioned OSS supply chain management. In addition to clarifying the rules between companies, it is characterized by working to raise the level of the entire supply chain through community activities such as OpenChain.

今回のケーススタディでは、多くの企業がライセンスコンプライアンスに言及していました。各企業はライセンスを遵守するために、全社的な体制やルールを整え、エンジニアに対してライセンス教育を実施していることがわかりました。ベストな方法は産業や企業の規模によっても異なりますが、各社の情報はこれからライセンスコンプライアンスに取り組もうとしている企業やプロセスの改善を進めている企業にとって非常に有益なものであると私は考えます。

また、脆弱性への対応、ライセンスコンプライアンスの両面から、企業にとってSBOMの重要性が高まっていることが確認できました。社内でOSSを利用する場合であっても、社外のパートナーとソフトウエアをやりとりする場合であっても、どのOSSを利用しているかを明確にすることが最重要だからです。この課題はソフトウエアの透過性といって欧米でも話題になっているものであると私は認識しています。

このケーススタディの中で複数の企業がOSSのサプライチェーンマネジメントについても言及していました。企業間でのルールを明確化する他、OpenChainなどのコミュニティ活動によって、サプライチェーン全体のレベルアップに取り組むことが特徴になっています。

Challenge 1: License compliance

When developing software using OSS, it is necessary to comply with the license declared by each OSS component. If companies do not conduct in-house license education and management appropriately, OSS license violations may occur.

Challenge 2: Long term support

Since the development period of an OSS project depends on the community’s level of activity, the support period may in some cases be shorter than the product life cycle.

Challenge 3: OSS supply chain management

Recently, the software supply chain scale has expanded, and there are frequent cases where OSS is included in deliveries from suppliers. OSS information sharing in the supply chain has become important to implement appropriate vulnerability countermeasures and license countermeasures.

Challenge 1: ライセンスコンプライアンス

OSSを利用してソフトウエアを開発する場合は、各OSSが宣言しているライセンスを遵守する必要があります。社内におけるライセンスに関する教育や管理体制が不十分な場合、OSSライセンスに違反してしまう可能性があります。 

Challenge 2: ロングタームサポート

OSSの開発期間はコミュニティの活性度に依存するため、場合によっては製品のライフサイクルよりもサポート期間が短くなってしまう可能性があります。

Challenge 3: サプライチェーンにおけるOSSの使用

最近はソフトウエアサプライチェーンの規模が拡大しており、サプライヤからの納品物にOSSが含まれるケースも頻繁に起こっています。適切な脆弱性対応、ライセンス対応などを実施するため、サプライチェーンの中でのOSSの情報共有が重要になってきています。

JP: What are the benefits of Japanese companies adopting standards such as OpenChain and SPDX?

OpenChainやSPDXなどの標準を採用している日本企業のメリットは何ですか?

ME: Companies need to do a wide range of things to ensure proper OSS license compliance, so some guidance is needed. The OpenChain Specification, which has become an ISO standard, is particularly useful as such a guideline. In fact, several companies that responded to this survey have built their OSS license compliance processes based on the OpenChain Specification.

Also, from the perspective of supply chain management, it is thought that if each supply chain company obtains OpenChain certification, software transparency will increase, and appropriate OSS utilization will be promoted.

In addition, by participating in OpenChain’s Japan Working Group, companies can share the best practices of each company and work together to solve problems.

Since SPDX is a leading international standard for SBOMs, using it when exchanging information about OSS across the supply chain is very useful from a compatibility standpoint.

Japanese companies not only use the SPDX standard but also actively contribute to the formulation of SPDX specifications, such as SPDX Lite.

企業がOSSライセンスコンプライアンスを適切に行うために行うべきことは多岐に渡るために何かしらの指針が必要です。そのための指針としてISOになったOpenChain Specificationは非常に有用なものです。実際、今回の調査に回答した複数の企業がOpenChain Specificationに基づいてOSSライセンスコンプライアンスプロセスを構築し、認証を取得しています。

また、サプライチェーンマネジメントの観点からも、サプライチェーン各社がOpenChain認証を取得することで、ソフトウエアの透過性が高まり、適切なOSSの利活用を促進されると考えられます。

更にOpenChainのJPWGに参加することで、各社のベストプラクティスを共有したり、協力して課題解決をすることもできます。

SPDXは重要性の高まっているSBOMの有力な国際標準であるため、サプライチェーン内でOSSに関する情報を交換する場合に、SPDXを利用することは互換性等の観点から非常に有益です。

日本企業はSPDXの標準を利用するだけではなく、SPDX LiteのようにSPDXの使用策定にも積極的にコントリビューションしています。

JP: Thank you, Endo-san! It has been great speaking with you today.

遠藤さん、ありがとうございました!本日は素晴らしい議論になりました。

The post Interview with Masato Endo, OpenChain Project Japan appeared first on Linux Foundation.

‘Master,’ ‘Slave’ and the Fight Over Offensive Terms in Computing (Kate Conger, New York Times, April 13, 2021)

Fri, 05/07/2021 - 01:14

Nearly a year after the Internet Engineering Task Force took up a plan to replace words that could be considered racist, the debate is still raging.

Anyone who joined a video call during the pandemic probably has a global volunteer organization called the Internet Engineering Task Force to thank for making the technology work. The group, which helped create the technical foundations of the internet, designed the language that allows most video to run smoothly online. It made it possible for someone with a Gmail account to communicate with a friend who uses Yahoo, and for shoppers to safely enter their credit card information on e-commerce sites.

Now the organization is tackling an even thornier issue: getting rid of computer engineering terms that evoke racist history, like “master” and “slave” and “whitelist” and “blacklist.”

But what started as an earnest proposal has stalled as members of the task force have debated the history of slavery and the prevalence of racism in tech. Some companies and tech organizations have forged ahead anyway, raising the possibility that important technical terms will have different meanings to different people — a troubling proposition for an engineering world that needs broad agreement so technologies work together.

While the fight over terminology reflects the intractability of racial issues in society, it is also indicative of a peculiar organizational culture that relies on informal consensus to get things done.

The Internet Engineering Task Force eschews voting, and it often measures consensus by asking opposing factions of engineers to hum during meetings. The hums are then assessed by volume and ferocity. Vigorous humming, even from only a few people, could indicate strong disagreement, a sign that consensus has not yet been reached.

The I.E.T.F. has created rigorous standards for the internet and for itself. Until 2016, it required the documents in which its standards are published to be precisely 72 characters wide and 58 lines long, a format adapted from the era when programmers punched their code into paper cards and fed them into early IBM computers.

“We have big fights with each other, but our intent is always to reach consensus,” said Vint Cerf, one of the founders of the task force and a vice president at Google. “I think that the spirit of the I.E.T.F. still is that, if we’re going to do anything, let’s try to do it one way so that we can have a uniform expectation that things will function.”

The group is made up of about 7,000 volunteers from around the world. It has two full-time employees, an executive director and a spokesman, whose work is primarily funded by meeting dues and the registration fees of dot-org internet domains. It cannot force giants like Amazon or Apple to follow its guidance, but tech companies often choose to do so because the I.E.T.F. has created elegant solutions for engineering problems.

Its standards are hashed out during fierce debates on email lists and at in-person meetings. The group encourages participants to fight for what they believe is the best approach to a technical problem.

While shouting matches are not uncommon, the Internet Engineering Task Force is also a place where young technologists break into the industry. Attending meetings is a rite of passage, and engineers sometimes leverage their task force proposals into job offers from tech giants.

In June, against the backdrop of the Black Lives Matter protests, engineers at social media platforms, coding groups and international standards bodies re-examined their code and asked themselves: Was it racist? Some of their databases were called “masters” and were surrounded by “slaves,” which received information from the masters and answered queries on their behalf, preventing them from being overwhelmed. Others used “whitelists” and “blacklists” to filter content.

Mallory Knodel, the chief technology officer at the Center for Democracy and Technology, a policy organization, wrote a proposal suggesting that the task force use more neutral language. Invoking slavery was alienating potential I.E.T.F. volunteers, and the terms should be replaced with ones that more clearly described what the technology was doing, argued Ms. Knodel and the co-author of her proposal, Niels ten Oever, a postdoctoral researcher at the University of Amsterdam. “Blocklist” would explain what a blacklist does, and “primary” could replace “master,” they wrote.

On an email list, responses trickled in. Some were supportive. Others proposed revisions. And some were vehemently opposed. One respondent wrote that Ms. Knodel’s draft tried to construct a new “Ministry of Truth.”

Amid insults and accusations, many members announced that the battle had become too toxic and that they would abandon the discussion.

The pushback didn’t surprise Ms. Knodel, who had proposed similar changes in 2018 without gaining traction. The engineering community is “quite rigid and averse to these sorts of changes,” she said. “They are averse to conversations about community comportment, behavior — the human side of things.”

In July, the Internet Engineering Task Force’s steering group issued a rare statement about the draft from Ms. Knodel and Mr. ten Oever. “Exclusionary language is harmful,” it said.

A month later, two alternative proposals emerged. One came from Keith Moore, an I.E.T.F. contributor who initially backed Ms. Knodel’s draft before creating his own. His cautioned that fighting over language could bottleneck the group’s work and argued for minimizing disruption.

The other came from Bron Gondwana, the chief executive of the email company Fastmail, who said he had been motivated by the acid debate on the mailing list.

“I could see that there was no way we would reach a happy consensus,” he said. “So I tried to thread the needle.”

Mr. Gondwana suggested that the group should follow the tech industry’s example and avoid terms that would distract from technical advances.

Last month, the task force said it would create a new group to consider the three drafts and decide how to proceed, and members involved in the discussion appeared to favor Mr. Gondwana’s approach. Lars Eggert, the organization’s chair and the technical director for networking at the company NetApp, said he hoped guidance on terminology would be issued by the end of the year.

The rest of the industry isn’t waiting. The programming community that maintains MySQL, a type of database software, chose “source” and “replica” as replacements for “master” and “slave.” GitHub, the code repository owned by Microsoft, opted for “main” instead of “master.”

In July, Twitter also replaced a number of terms after Regynald Augustin, an engineer at the company, came across the word “slave” in Twitter’s code and advocated change.

But while the industry abandons objectionable terms, there is no consensus about which new words to use. Without guidance from the Internet Engineering Task Force or another standards body, engineers decide on their own. The World Wide Web Consortium, which sets guidelines for the web, updated its style guide last summer to “strongly encourage” members to avoid terms like “master” and “slave,” and the IEEE, an organization that sets standards for chips and other computing hardware, is weighing a similar change.

Other tech workers are trying to solve the problem by forming a clearinghouse for ideas about changing language.

That effort, the Inclusive Naming Initiative, aims to provide guidance to standards bodies and companies that want to change their terminology but don’t know where to begin.

The group got together while working on an open-source software project, Kubernetes, which like the I.E.T.F. accepts contributions from volunteers. Like many others in tech, it began the debate over terminology last summer.

“We saw this blank space,” said Priyanka Sharma, the general manager of the Cloud Native Computing Foundation, a nonprofit that manages Kubernetes. Ms. Sharma worked with several other Kubernetes contributors, including Stephen Augustus and Celeste Horgan, to create a rubric that suggests alternative words and guides people through the process of making changes without causing systems to break. Several major tech companies, including IBM and Cisco, have signed on to follow the guidance.


Priyanka Sharma and several other tech workers in the Inclusive Naming Initiative came up with a rubric to suggest alternative words.

Although the Internet Engineering Task Force is moving more slowly, Mr. Eggert said it would eventually establish new guidelines. But the debate over the nature of racism — and whether the organization should weigh in on the matter — has continued on its mailing list.

In a subversion of an April Fools’ Day tradition within the group, several members submitted proposals mocking diversity efforts and the push to alter terminology in tech.

Two prank proposals were removed hours later because they were “racist and deeply disrespectful,” Mr. Eggert wrote in an email to task force participants, while a third remained up.

“We build consensus the hard way, so to speak, but in the end the consensus is usually stronger because people feel their opinions were reflected,” Mr. Eggert said. “I wish we could be faster, but on topics like this one that are controversial, it’s better to be slower.”

Kate Conger is a technology reporter in the San Francisco bureau, where she covers the gig economy and social media. @kateconger

The post ‘Master,’ ‘Slave’ and the Fight Over Offensive Terms in Computing (Kate Conger, New York Times, April 13, 2021) appeared first on Linux Foundation.

Open Mainframe Project Launches Call for Proposals for the 2nd Annual Open Mainframe Summit on September 22-23

Wed, 05/05/2021 - 23:00

Registration for the Virtual Event is now Open

SAN FRANCISCO, May 5, 2021 – The Open Mainframe Project (OMP), an open source initiative that enables collaboration across the mainframe community to develop shared tool sets and resources, today announced plans for its 2nd annual Open Mainframe Summit, the premier mainframe event of 2021. The event, set for September 22-23, is open to students, developers, users and contributors of Open Mainframe projects from around the globe looking to learn, network and collaborate. As a virtual event again this year, Open Mainframe Summit will feature content tracks that tackle both business and technical strategies for enterprise development and deployment.

In Open Mainframe Project’s inaugural event last year, more than 380 registrants from 175 companies joined the two-day conference that featured 36 sessions. Some of the most popular sessions were the Women in Tech panel, COBOL sessions, new mainframer journey and project overview sessions for Ambitus, Feilong, Polycephaly, and Zowe. The event report can be found here and all of the videos can be watched here.

“Open Mainframe Project is becoming the gateway to all educational tools and initiatives that run some of the world’s biggest enterprise systems,” said John Mertic, Director of Program Management at the Linux Foundation. “For our inaugural event last year, we merely dipped our toes in the water as a new summit. This year, we’ll see more change makers speaking about open source innovation, creativity and diversity in mainframe related technologies. We look forward to igniting conversations that are going to positively impact all facets of mainframes.”

Call for Proposals

The Call for Proposals is now open and will be accepting submissions until July 16, 2021. Interested speakers can submit proposals in five tracks: business overview; Linux on Z; z/OS; education and training; and diversity, equity and inclusion. Options for presentations include lightning talks, 30-minute sessions and panel discussions.

A program committee, which will include maintainers, active community members and project leaders, will review and rate the proposals once all the submissions are in. This year, Open Mainframe Project welcomes Greg MacKinnon, Distinguished Engineer at Broadcom, Inc; Joe Winchester, Technical Staff Member at IBM; Kimberly Andersson, Director of Experience Design at Rocket Software; Stacey Miller, Product Marketing Manager at SUSE; and Harry Williams, Chief Technology Officer at Marist College as the 2021 Open Mainframe Summit program committee.

Submit a proposal here: https://events.linuxfoundation.org/open-mainframe-summit/program/cfp/.

Whether a company is a member or contributor of Open Mainframe Project or is sponsoring the event has no impact on whether talks from its developers will be selected. However, being a community leader does have an impact, as program committee members will often rate talks from the creators or leaders of an open source project more highly. Proposals should focus on work with an open source project that adds value to the ecosystem, whether it is one of the Open Mainframe Project’s 18 hosted projects or one of its working groups.

Conference registration for the online event is $50 for general attendance and $15 for academia. Registration is now open; click here to register.

Thank you Sponsors

Open Mainframe Summit is made possible with support from our Platinum Sponsors Broadcom Mainframe Software, Rocket Software, and SUSE; our Gold Sponsor Vicom Infinity; and our Academic and Community Sponsors CD Foundation and the Fintech Open Source Foundation (FINOS). To become a sponsor, click here.

For more about Open Mainframe Project, visit https://www.openmainframeproject.org/

About the Open Mainframe Project

The Open Mainframe Project is intended to serve as a focal point for deployment and use of Linux and Open Source in a mainframe computing environment. With a vision of Open Source on the Mainframe as the standard for enterprise class systems and applications, the project’s mission is to build community and adoption of Open Source on the mainframe by eliminating barriers to Open Source adoption on the mainframe, demonstrating value of the mainframe on technical and business levels, and strengthening collaboration points and resources for the community to thrive. Learn more about the project at https://www.openmainframeproject.org.

About The Linux Foundation

The Linux Foundation is the organization of choice for the world’s top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at www.linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

###

The post Open Mainframe Project Launches Call for Proposals for the 2nd Annual Open Mainframe Summit on September 22-23 appeared first on Linux Foundation.

Linux Foundation Launches Open Source Digital Infrastructure Project for Agriculture, Enables Global Collaboration Among Industry, Government and Academia

Wed, 05/05/2021 - 21:00

AgStack Foundation will build and sustain the global data infrastructure for food and agriculture to help scale digital transformation and address climate change, rural engagement and food and water security

SAN FRANCISCO, Calif., May 5, 2021 –  The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the launch of the AgStack Foundation, the open source digital infrastructure project for the world’s agriculture ecosystem. AgStack Foundation will improve global agriculture efficiency through the creation, maintenance and enhancement of free, reusable, open and specialized digital infrastructure for data and applications.

Founding members and contributors include leaders from both the technology and agriculture industries, as well as across sectors and geographies. Members and partners include Agralogics, Call for Code, Centricity Global, Digital Green, Farm Foundation, farmOS, HPE, IBM, Mixing Bowl & Better Food Ventures, NIAB, OpenTeam, Our Sci, Produce Marketing Association, Purdue University / OATS & Agricultural Informatics Lab, the University of California Agriculture and Natural Resources (UC-ANR) and University of California Santa Barbara SmartFarm Project.

“The global Agriculture ecosystem desperately needs a digital makeover. There is too much loss of productivity and innovation due to the absence of re-usable tools and data. I’m excited to lead this community of leaders, contributors and members – from across sectors and countries – to help build this common and re-usable resource – AgStack – that will help every stakeholder in global agriculture with free and open digital tools and data,” said Sumer Johal, Executive Director of AgStack.

Thirty-three percent of all food produced is wasted, while nine percent of the people in the world are hungry or undernourished. These societal drivers are compounded by legacy technology systems that are too slow and inefficient and cannot work across the growing and more complex agricultural supply chain. AgStack will use collaboration and open source software to build the 21st-century digital infrastructure that will be a catalyst for innovation in new applications, efficiencies and scale.

AgStack consists of an open repository to create and publish models, free and easy access to public data, interoperable frameworks for cross-project use, and topic-specific extensions and toolboxes. It will leverage existing technologies such as agriculture standards (AgGateway, UN-FAO, CAFA, USDA and NASA-AR); public data (Landsat, Sentinel, NOAA and Soilgrids); models (UC-ANR IPM); and open source projects like Hyperledger, Kubernetes, Open Horizon, Postgres, Django and more.

“We’re pleased to provide the forum for AgStack to be built and to grow,” said Mike Dolan, general manager and senior vice president of projects at the Linux Foundation. “It’s clear that by using open source software to standardize the digital infrastructure for agriculture, that AgStack can reduce cost, accelerate integration and enable innovation. It’s amazing to see industries like agriculture use open source principles to innovate.”

For more information about AgStack, please visit: http://www.agstack.org

Member/Partner Statements

Call for Code

“Through Call for Code and IBM’s tech-for-good programs, we’ve seen amazing grassroots innovation created by developers who build solutions to address local farming issues that affect them personally,” said Daniel Krook, IBM CTO for Call for Code. “As thriving, sustainable open source projects hosted at the Linux Foundation, applications like Agrolly and Liquid Prep have access to a strong ecosystem of partners and will be able to accelerate their impact through a shared framework of open machine learning models, data sets, libraries, message formats, and APIs such as those provided by AgStack.”

Centricity Global

“Interoperability means working together and open source has proven to be the most practical means of doing so. Centricity Global looks forward to bringing our teams, tools and applications to the AgStack community and to propelling projects that deliver meaningful value long-term,” said Drew Zabrocki, Centricity Global. “Now is the time to get things done. The docking concept at AgStack is a novel way to bring people and technology together under a common, yet sovereign framework; I see great potential for facilitating interoperability and data sovereignty in a way that delivers tangible value on the farm forward across the supply value chain.”

Digital Green

“The explosion of agri-tech innovations from large companies to startups to governments to non-profits represents a game changer for farmers in both the Global South and North.  At the same time, it’s critical that we build digital infrastructure that ensures that the impact of these changes enables the aspirations of those most marginalized and builds their resilience, particularly in the midst of climate change. We’re excited about joining hands with AgStack with videos produced by & for farmers and FarmStack, a secure data sharing protocol, that fosters community and trust and puts farmers back in the center of our food & agricultural system,” said Rikin Gandhi, Co-founder and Executive Director.

Farm Foundation

“The advancements in digital agriculture over the past 10 years have led to more data than ever before—data that can be used to inform business decisions, improve supply and demand planning and increase efficiencies across stakeholders. However, the true potential of all that data won’t be fully realized without achieving interoperability via an open source environment. Interoperable data is more valuable data, and that will lead to benefits for farmers and others throughout the food and ag value chain,” said Martha King, Vice President of Programs and Projects, Farm Foundation.

farmOS

“AgStack’s goal of creating a shared community infrastructure for agricultural datasets, models, frameworks, and tools fills a much-needed gap in the current agtech software landscape. Making these freely available to other software projects allows them to focus on their unique value and build upon the work of others. We in the farmOS community are eager to leverage these shared resources in the open source record keeping tools we are building together,” said Michael Stenta, founder and lead developer, farmOS.

HPE

“The world’s food supply needs digital innovation that currently faces challenges of adoption due to the lack of a common, secure, community-maintained digital infrastructure. AgStack – A Linux Foundation’s Project, is creating this much needed open source digital infrastructure for accelerating innovation. We at Hewlett Packard Enterprise are excited about contributing actionable insights and learnings to solve data challenges that this initiative can provide and we’re committed to its success!” said Janice Zdankus, VP, Innovation for Social Impact, Office of the CTO, Hewlett Packard Enterprise.

Mixing Bowl & Better Food Ventures

“There are a lot of people talking about interoperability; it is encouraging to see people jump in to develop functional tools to make it happen. We share the AgStack vision and look forward to collaborating with the community to enable interoperability at scale,” said Rob Trice, Partner, The Mixing Bowl & Better Food Ventures.

NIAB

“Climate change is a global problem and agriculture needs to do its part to reduce greenhouse gas emissions during all stages of primary production. This requires digital innovation and a common, global, community-maintained digital infrastructure to create the efficient, resilient, biodiverse and low-emissions food production systems that the world needs. These systems must draw on the best that precision agriculture has to offer and aligned innovations in crop science, linked together through open data solutions. AgStack – A Linux Foundation Project, is creating this much needed open-source digital infrastructure for accelerating innovation. NIAB are excited to join this initiative and will work to develop a platform that brings together crop and data science at scale. As the UK’s fastest growing, independent crop research organization NIAB provides crop science, agronomy and data science expertise across a broad range of arable and horticultural crops,” said Dr Richard Harrison, Director of NIAB Cambridge Crop Research.

OpenTEAM

“Agriculture is a shared human endeavor and global collaboration is necessary to translate our available knowledge into solutions that work on the ground necessary to adapt and mitigate climate change, improve livelihoods, and biodiversity as well as the produce of abundant food fiber and energy.  Agriculture is at the foundation of manufacture and commerce and AgStack represents a collaborative effort at a scale necessary to meet the urgency of the moment and unlock our shared innovative capacity through free, reusable, open digital infrastructure.  OpenTEAM is honored to join with the mission to equip producers with tools that both support data sovereignty for trusted transactions while also democratizing site specific agricultural knowledge regardless of scale, culture or geography,” said Dr. Dorn Cox, project lead and founder of Open Technology Ecosystem for Agricultural Management and research director for Wolfe’s Neck Center for Agriculture & the Environment.

Our Sci

“AgStack provides a framework for a scalable base of open source software, and the shared commitment to keep it alive and growing.  We’re excited to see it succeed!” said Greg Austic, owner, Our Sci.

Produce Marketing Association

“The digitization of data will have tremendous benefits for the Fresh Produce and Floral industry in the areas of traceability, quality management, quality prediction and other efficiencies through supply chain visibility. The key is challenges to adoption is interoperability and the development of a common, community-maintained digital infrastructure. I am confident that AgStack – A Linux Foundation’s Project, can create this much needed open-source digital infrastructure for accelerating innovation. We at Produce Marketing Association are excited about this initiative and we are committed to its success,” said Ed Treacy, VP of Supply Chain and Sustainability.

Purdue University

“We need fundamental technical infrastructure to enable open innovation in agriculture, including ontologies, models, and tools. Through the AgStack Project, the Linux Foundation will provide valuable cohesion and development capacity to support shared, community-maintained infrastructure. At the Agricultural Informatics Lab, we’re committed to enabling resilient food and agricultural systems through deliberate design and development of such infrastructure,” said Ankita Raturi, Assistant Professor, Agricultural Informatics Lab, Purdue University.

“True interoperability requires a big community and we’re excited to see the tools that we’ve brought to the open-source ag community benefiting new audiences.  OATS Center at Purdue University looks forward to docking the Trellis Framework for supply chain, market access and regulatory compliance through AgStack for the benefit of all,” said Aaron Ault, Co-Founder OATS Center at Purdue University.

UC Davis

“Translating 100+ years of UC agricultural research into usable digital software and applications is a critical goal in the UC partnership with the AgStack open source community. We are excited about innovators globally using UC research and applying it to their local crops through novel digital technologies,” said Glenda Humiston, VP of Agriculture and Natural Resources, University of California.

“Artificial Intelligence and Machine Learning are critical to food and agriculture transformation, and will require new computational models and massive data sets to create working technology solutions from seed to shelf. The AI Institute for Next Generation Food Systems is excited to partner with the AgStack open source community to make our work globally available to accelerate the transformation,” said Ilias Tagkopoulos, Professor, Computer Science at UC Davis and Director, AI Institute of Next Generation Food Systems.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

###

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page:  https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

Media Contact

Jennifer Cloer
for Linux Foundation
503-867-2304
jennifer@storychangesculture.com

The post Linux Foundation Launches Open Source Digital Infrastructure Project for Agriculture, Enables Global Collaboration Among Industry, Government and Academia appeared first on Linux Foundation.

The Linux Foundation Announces Open Source Summit + Embedded Linux Conference 2021 Will Move From Dublin, Ireland to Seattle, Washington

Wed, 04/28/2021 - 08:40
  • Calls for Speaking Proposals close June 13
  • OSPOCon and Linux Security Summit will also move to Seattle
  • All events will take place September 27 – October 1

SAN FRANCISCO, April 27, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced today that Open Source Summit + Embedded Linux Conference 2021, along with Linux Security Summit and OSPOCon, will take place in Seattle, Washington, USA, from September 27 – October 1.

Earlier in the year, it was announced that instead of separate North America and Europe editions of Open Source Summit + Embedded Linux Conference (OS Summit + ELC), only one would be held in 2021, located in Dublin, Ireland. The decision to move these events from Dublin, Ireland to Seattle, Washington, USA, has been made due to the current state of vaccination rates in Europe and upon review of past attendee survey results regarding where and when they would feel comfortable traveling this year.  

OS Summit + ELC will be held in a hybrid format, with both in-person and virtual offerings, to ensure that everyone who wants to participate is able to do so.

KVM Forum, which was also scheduled to take place in Dublin, will now be a virtual event taking place September 15-16. New details on Linux Plumbers Conference and Linux Kernel Maintainer Summit, also previously scheduled in Dublin, will be announced shortly. A second OSPOCon, OSPOCon Europe, will be held in London on October 6, 2021, with more details coming soon.

Registration for all events will open in June, after more details on local regulations and venue safety plans are available. 

Calls for Speaking Proposals
The Call for Speaking Proposals for OS Summit + ELC and OSPOCon is open through Sunday, June 13 at 11:59pm PDT. Interested community members are encouraged to apply here. Speakers will be able to speak in person or remotely.

Linux Security Summit’s Call for Proposals is open through Sunday, June 27 at 11:59pm PDT.  Applications are being accepted here.

Sponsorships
Sponsorships are available for all events. Benefits include speaking opportunities, prominent branding, opportunities to support diversity and inclusion, lead generation activities, event passes, and more. View the sponsorship prospectus here or email us to learn more.  

Open Source Summit + Embedded Linux Conference 2021 is made possible thanks to Diamond Sponsors IBM and Red Hat, Platinum Sponsor Huawei and Gold Sponsor Soda Foundation, among others. For information on becoming an event sponsor, click here.

OSPOCon is presented by The Linux Foundation and the TODO Group and is made possible by Host Sponsors Eclipse Foundation and Huawei, and Supporter Sponsor Sauce Labs. For information on becoming an event sponsor, click here

Linux Security Summit is made possible by General Sponsor Technology Innovation Institute, and Supporter Sponsors IBM and Indeed. For information on becoming a sponsor, click here

Members of the press who would like to request a media pass should contact Kristin O’Connell at koconnell@linuxfoundation.org

About The Linux Foundation
Founded in 2000, The Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users, and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

Follow The Linux Foundation on Twitter, Facebook, and LinkedIn for all the latest news, event updates and announcements.

The Linux Foundation Events are where the world’s leading technologists meet, collaborate, learn and network in order to advance innovations that support the world’s largest shared technologies.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

####

Media Contact:

Kristin O’Connell
The Linux Foundation
koconnell@linuxfoundation.org

The post The Linux Foundation Announces Open Source Summit + Embedded Linux Conference 2021 Will Move From Dublin, Ireland to Seattle, Washington appeared first on Linux Foundation.

The Linux Foundation Hosts Forum to Share Linux Stories for 30th Anniversary

Thu, 04/22/2021 - 22:00

Linux community to share personal stories of how Linux has impacted their lives; thirty submissions to be highlighted for the anniversary.

SAN FRANCISCO, April 22, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, is recognizing World Penguin Day, April 25, by kicking off a global campaign to find out how Linux has most impacted people’s lives. Everyone is invited to share. Thirty submissions will be randomly selected and highlighted in celebration of the 30th Anniversary of Linux, occurring this year.

In addition, The Linux Foundation will adopt thirty penguins, the animal synonymous with Linux, from the Southern African Foundation for the Conservation of Coastal Birds. Each of the thirty randomly chosen submitters will be able to name one of the adopted penguins and will have a certificate and picture of the penguin sent to them to mark this momentous time in Linux’s history as well as Linux’s impact on their own life. 

“Linux has changed the world and created innovation in incredibly diverse ways,” said Angela Brown, SVP & GM Events, The Linux Foundation.  “It has also had a huge impact on individuals’ lives. Open source is fundamentally about community, and we want to hear directly from the community about how Linux has impacted them personally. We can’t wait to hear stories from around the world, and more importantly, we look forward to sharing these stories and hope they inspire more people to join the community for the next 30 years of innovation and beyond.”

Those who would like to submit can do so here. Submissions are being accepted through May 9. The highlighted submissions will be selected in June and showcased in a blog post on events.linuxfoundation.org and on The Linux Foundation’s social media channels. Submitters chosen will also be notified before then by email.


About The Linux Foundation
Founded in 2000, The Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users, and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

Follow The Linux Foundation on Twitter, Facebook, and LinkedIn for all the latest news, event updates and announcements.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage.

Linux is a registered trademark of Linus Torvalds.

####

Media Contact:

Kristin O’Connell
The Linux Foundation
koconnell@linuxfoundation.org

The post The Linux Foundation Hosts Forum to Share Linux Stories for 30th Anniversary appeared first on Linux Foundation.

Interview with Jory Burson, Community Director, OpenJS Foundation on Open Source Standards

Thu, 04/22/2021 - 21:00

Jason Perlow, Editorial Director of the Linux Foundation, chats with Jory Burson, Community Director at the OpenJS Foundation, about open standardization efforts and why they matter for open source projects.

JP: Jory, first of all, thanks for doing this interview. Many of us know you from your work at the OpenJS Foundation, the C2PA, and on open standards, and you’re also involved in many other open community collaborations. Can you tell us a bit about yourself and how you got into working on Open Standards at the LF?

JB: While I’m a relatively new addition to the Linux Foundation, I have been working with the OpenJS Foundation, which is hosted by the Linux Foundation, for probably three years now. As some of your readers may know, OpenJS is home to several very active JavaScript open source projects, and many of those maintainers are really passionate about web standards. Inside that community, we’ve got a core group of about 20 people participating actively at Ecma International on the JavaScript TCs, the W3C, the Unicode Consortium, the IETF, and some other spaces, too. What we wanted to do was create a space where those experts can get together, discuss things in a cross-project sort of way, and then also help onboard new people into this world of web standards, because it can be a very intimidating thing to try to get involved in from the outside.

The Joint Development Foundation is something I’m new to, but as part of that, I’m very excited to get to support the C2PA, which stands for Coalition for Content Provenance and Authenticity; it’s a new effort as well. They’re going to be working on standards related to media provenance and authenticity, to battle fakes and establish trustworthiness in media formats, so I’m looking forward to supporting that project as it grows.

JP: When you were at Bocoup, which was a web engineering firm, you worked a lot with international standards organizations such as Ecma and W3C, and you were in a leadership role in the TC53 group, which standardizes JavaScript for embedded systems. What are the challenges that you faced when working with organizations like that?

JB: There are the usual challenges that I think face any international or global team, such as coordination of meeting times and balancing the tension between asynchronously conducting business via email lists, GitHub, and that kind of thing. And then more synchronous forms of communication or work, like Slack and actual in-person meetings. Today, we don’t really worry as much about the in-person meetings, but still, there’s like, this considerable overhead of, you know, “human herding” problems that you have to overcome. 

Another challenge is understanding the pace at which the organization you’re operating in really moves. This is a complaint we hear from many people who are new to standardization and are used to developing projects within their product team at a company. Even within an open source project, people are used to things moving perhaps a bit faster and don’t necessarily understand that there are actually built-in checks in the process — in some cases, to ensure that everybody has a chance to review, everybody has an opportunity to comment fairly, and that kind of thing.

Sometimes, because that process is something that’s institutional knowledge, it can be surprising to newcomers in the committees — so they have to learn that there’s this other system that operates at an intentionally different pace. And how does that intersect with your work product? What does that mean for the back timing of your deliverables? That’s another category of things that is “fun” to learn. It makes sense once you’ve experienced it, but maybe running into it for the first time isn’t quite as enjoyable.

JP: Why is it difficult to turn something like a programming language into an internationally accepted standard? In the past, we’ve seen countless flavors of C and Pascal and things like that.

JB: That’s a really good question. I would posit that programming languages are some of the easier types of standards to move forward today because the landscape of what that is and the use cases are fairly clear. Everybody is generally aware of the concept that languages are ideally standardized, and we all agree that this is how this language should work. We’re all going to benefit, and none of us are necessarily, outside of a few cases, trying to build a market in which we’re the dominant player based solely on a language. In my estimation, that tends to be an easier case to bring lots of different stakeholders to the table and get them to agree on how a language should proceed. 

In some of the cases you mentioned, as with C and Pascal, those are older languages. And I think that there’s been a shift in how we think about some of those things. In the past, it was much more challenging to put a new language out there and encourage its adoption, and it was a much more difficult task to get information out to people about how that language worked.

Today with the internet, we have a very easy distribution system for how people can read, participate, and weigh in on a language. So I don’t think we’re going to see quite as many variations in standardized languages, except in some cases where, for example, with JavaScript, TC53 is carving out a subset library of JavaScript, which is optimized for sensors and lower-powered devices. So long story short, it’s a bit easier, in my estimation, to do the language work. Where I think it gets more interesting and difficult is actually in some of the W3C communities where we have standardization activities around specific web API’s you have to make a case for, like, why this feature should actually become part of the platform versus something experimental…

JP: … such as for Augmented Reality APIs or some highly specialized 3D rendering thing. So what are the open standardization efforts you are actively working on at the LF now, at this moment?

JB: At this exact moment, I am working with the OpenJS Foundation standards working group, and we’ve got a couple of fun projects that we’re trying to get off the ground. One is creating a Learning Resource Center for people who want to learn more about what standardization activities really look like, what they mean, some of the terminologies, etc. 

For example, many people say that getting involved in open source is overwhelming — it’s daunting because there’s a whole glossary of things you might not understand. Well, it’s the same for standardization work, which has its own entire new glossary of things. So we want to create a learning space for people who think they want to get involved. We’re also building out a feedback system for users, open source maintainers, and content authors. This will help them say, “here’s a piece of feedback I have about this specific proposal that may be in front of a committee right now.”

So those are two things. But as I mentioned earlier, I’m still very new to the Linux Foundation. And I’m excited to see what other awesome standardization activities come into the LF.

JP: Why do you feel that the Linux Foundation now needs to double down on its open standards efforts?

JB: One of the things that I’ve learned over the last several years working with different international standards organizations is that they have a very firm command of their process. They understand the benefits of why and how a standard is made, why it should get made, those sorts of things. However, they often don’t have as strong a grasp as they ought to of how the software sausage is really made. And I think the Linux Foundation, with all of its amazing open source projects, is way closer to the average developer and the average software engineer and what their reality is like than some of these international standards-developing organizations, because the SDOs are serving different purposes in this grander vision of ICT interoperability.

On the ground, we have, you know, the person who’s got to build the product to make sure it’s fit for purpose, make sure it’s conformant, and make it work for their customers. In the policy realm, we have these standardization folks who are really good at making sure that the policy fits within a regulatory framework, is fair and equitable, and that everybody has had a chance to bring concerns to the table — concerns the average developer may not have time to think about, like privacy or security or whatever it might be. So the Linux Foundation and other open source organizations need to fill more of the role of a bridge-builder between these populations, because they need to work together to make useful and interoperable technologies for the long term.

That’s not something that one group can do by themselves. Both groups want to make that happen. And I think it’s really important that the LF demonstrate some leadership here.

JP: Is it not enough to make open software projects and get organizations to use them? Or are open standards something distinctly different and separate from open source software?

JB: I think I’ll start by saying there are some pretty big philosophical differences in how we approach a standard versus an open source project. And I think the average developer is pretty comfortable with the idea that version 1.0 of an open source project may not look anything like version 2.0. There are often going to be cases and examples where there are breaking changes; there’s stuff that they shouldn’t necessarily rely on in perpetuity, and that there’s some sort of flex that they should plan for in that kind of thing.

The average developer has a much stronger sense with a standardization activity that those things should not change, and should not change dramatically in a short period. JavaScript is a good example of a language that changes every year; new features are added, but there aren’t breaking changes; it’s backward compatible. There are some guarantees in terms of a standard platform’s stability versus an open source platform, for example. And further, we’re developing more of a sense of what the higher bar, if you will, is for open standards activities, including things like test suites, documentation, and a required number of reference implementations.

Those are all concepts that are kind of getting baked into the idea of what makes a good standard. There are plenty of standards out there that nobody has ever even implemented — people got together and agreed on how something should work and then never did anything with it. And that’s not the kind of standard we want to make or the kind of thing we want to promote.

But if we point to examples like JavaScript — here’s this community we have created, here’s the standard, it’s got this great big group of people who all worked on it together openly and equitably. It’s got great documentation, it’s got a test suite that accompanies it — so you can run your implementation against that test suite and see where the dragons lie. And it’s got some references and open source reference implementations that you can view.  

Those sorts of things really foster a sense of trustworthiness in a standard — it gives you a sense that it’s something that’s going to stick around for a while, perhaps longer than an open source project, which may be sort of the beginnings of a standardization activity. It may be a reference to implementing a standard, or some folks just sort of throwing spaghetti at a wall and trying to solve a problem together. And I think these are activities that are very complementary with each other. It’s another great reason why other open source projects and organizations should be getting involved and supporting standardization activities.
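To make the backward-compatibility guarantee described above a little more concrete, here is a small illustrative sketch in TypeScript (the Config shape and function names are our own, purely hypothetical): a yearly ECMAScript edition can add new syntax, such as optional chaining in ES2020, while code written before that edition keeps exactly the same meaning.

```ts
// Hypothetical example: reading an optional port from a config object.
interface Config {
  server?: { port?: number };
}

// Pre-ES2020 style: explicit guards. This code keeps working, unchanged,
// on every later edition of the language.
const legacyRead = (cfg: Config): number =>
  cfg.server && cfg.server.port ? cfg.server.port : 8080;

// ES2020 style: optional chaining and nullish coalescing, added as new
// syntax without breaking anything that came before.
const modernRead = (cfg: Config): number => cfg.server?.port ?? 8080;

console.log(legacyRead({}), modernRead({})); // 8080 8080 (same behavior)
```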

JP: Do open standardization efforts make a case for open source software even stronger? 

JB: I think so; I just see them as so mutually beneficial, right? Because in the case of an open standards activity, you may be working with some folks and trying to express what something should look like in prose. Most of the time, a standard is written in prose and a pseudocode sort of style; it’s not something you can feed into the machine and have it work. So the open source projects, and polyfills, and things of that sort can really help a community of folks working on a problem say, “Aha, I understand what you mean!”, “This is how we interpreted this, but it’s producing some unintended behaviors,” or “we see that this will be hard to test, or we see that this creates a security issue.”

It’s a way of putting your ideas down on paper, understanding them together, and having a tool everybody can pull on and say, “Okay, let’s play with it and see if this is really working for what we need it for.”

Yes, I think they’re very compatible.

JP: Like peanut butter and jelly.

JB: Peanut butter and jelly. Yeah.
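As a concrete illustration of the polyfill idea mentioned above, here is a minimal, simplified sketch in TypeScript (illustrative only, not the spec-complete algorithm): when a runtime predates a standardized method such as Array.prototype.includes from ES2016, an open source polyfill can patch the gap so code written against the published standard still runs.

```ts
// Simplified polyfill sketch: add Array.prototype.includes only when the
// host environment does not already provide it.
if (!Array.prototype.includes) {
  Object.defineProperty(Array.prototype, "includes", {
    value: function (this: unknown[], searchElement: unknown, fromIndex = 0): boolean {
      // Negative fromIndex counts back from the end, as in the standard.
      const start = fromIndex < 0 ? Math.max(this.length + fromIndex, 0) : fromIndex;
      for (let i = start; i < this.length; i++) {
        if (this[i] === searchElement) return true;
      }
      return false;
    },
    writable: true,
    configurable: true,
  });
}

// Code targeting the standard now behaves the same on old and new engines.
console.log(["html", "css", "js"].includes("js")); // true
```

Real polyfills follow the specification’s algorithm much more closely; the point is that an open, runnable implementation gives everyone a shared way to check the prose of a standard against working code.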

JP: I get why large organizations might want things like programming languages, APIs, and communications protocols to be open standards, but what are the practical benefits that average citizens get from establishing open standards? 

JB: Open standards really help promote innovation and market activity for all players regardless of size. Now, granted, for the most part, a lot of the activities we’ve been talking about are funded by some bigger players. You know, when you look at the member lists of some of the standards bodies, it’s larger companies like the IBMs, Googles, and Microsofts of the world, the companies that provide a good deal more of the funding. Still, hundreds of small and midsize businesses are also benefiting from standards development. 

You mentioned my work at Bocoup earlier — that’s another great example. We were a consulting firm that benefited heavily from participating in and leveraging open standards to help build tools and software for our customers. So it is a system that I think helps create an equitable market playing field for all the parties. It’s one of those actual examples of a rising tide lifting all boats, if we’re doing it in a genuinely open and pro-competitive way. Now, that’s not always the case; in other types of standardization areas, that’s not always true. But certainly, in our web platform standards, that’s been the case. And it means that other companies and other content authors can build web applications, websites, services, digital products, that kind of thing. Everybody benefits — whether those people are also Microsoft customers, Google customers, and so on. So it’s an ecosystem.

JP: I think it’s great that we’ve seen companies like Microsoft, which used to have much more closed systems, embrace open standards over the last ten years or so. If you look at the first Internet Explorer they ever shipped, there once were websites that only worked on that browser. Today, the very idea of a website that only works correctly on one company’s web browser is ridiculous, right? We now have open source engines, used by these browsers, that embrace open standards and have become much more standardized. So I think that open standards have helped some of these big companies that were more closed become more open. We even see it happen at companies like Apple. They use the Bluetooth protocol to connect to their audio hardware and have adopted technologies such as the USB-C connector, where previously they were using weird proprietary connectors. So they, too, understand that open standards are a good thing. And that helps the consumer, right? I can go out and buy a wireless headset, and I know it’ll work because it uses the Bluetooth protocol. Could you imagine if we had nine different types of wireless networking instead of WiFi? You wouldn’t be able to walk into a store and buy something and know that it would work on your network. It would be nuts. Right?

JB: Absolutely. You’re pointing to hardware and the standards for physical products and goods versus digital products and goods in your example. So in using that example, do you want to have seven different adapters for something? No, it causes confusion and frustration in the marketplace. And the market winner is the one who’s going to be able to provide a solution that simplifies things.

That’s kind of the same thing with the web. We want to simplify the solutions for web developers so they’re not having to say, “Okay, what am I going to target? Am I going to target Edge? Am I going to target Safari?”

JP: Or is my web app going to work correctly in six years or even six months from now?

JB: Right!

JP: Besides web standards, are there other types of standardization you are passionate about, either inside the LF or in your spare time? 

JB: It’s interesting, because I think in my career, I’ve followed this journey of first getting involved because it was intellectually interesting to me. Then I got involved because it made my job easier. Like, how does this help me do business more effectively? How does this help me make my immediate life, my life as a developer, and my life as an internet consumer a little bit nicer?

Beyond that, you start to think an order of magnitude bigger, about the social impact of our standardization activities. I often think about the role that standards have played in improving the lives of everyday people. For the last 100 years, we have had building standards, fire standards, and safety standards, all of these things. And because they were developed, adopted, and implemented in global policy, they have saved people’s lives.

Apply that to tech — of course, it makes sense that you would have safety standards to prevent a building from burning down — so what is the version of that for technology? What’s the fire safety standard for the web? And how do we actually think about the standards that we make impacting people and protecting them the way that those other standards did?

One of the things that has changed in the last few years is that the Technical Advisory Group, or “TAG,” at the W3C is considering more of the social impact questions in its work. TAG is a group of architects elected by the W3C membership to take a horizontal, global view of the technologies that the W3C standardizes. These folks say, “Okay, great; you’re proposing that we standardize this API. Have you considered it from an accessibility standpoint? Have you considered it from, you know, an ease-of-use or security standpoint?” and that sort of thing.

In the last few years, they started looking at it from an ethical standpoint, such as, “what are the questions of privacy?” How might this technology be used for the benefit of the average person? And also, perhaps, how could it potentially be used for evil? And can we prevent that reality? 

So one of the things I think is most exciting is the types of technologies that are advancing today, which are less about whether we can make X and Y interoperable and more about whether we can make X and Y interoperable in a safe, ethical, economical, and ecological fashion; the space around NFTs right now is a case in point. And can we make technology beneficial in a way that goes above and beyond “okay, great, we made the website, quick, click here”?

So C2PA, I think, is an excellent example of a standardization activity supported by the LF that could benefit people. One of the big issues of the last several years is the authenticity of the media we consume — whether it was altered or synthesized in some fashion, such as what we see with deepfakes. Now, the C2PA is not going to be able to say, and would not say, whether a media file is fake. Rather, it would allow an organization to ensure that the media it captures or publishes can be analyzed for tampering between steps in the editing process and the time an end user consumes it. This would allow organizations and people to have more trust in the media they consume.

JP: If there was one thing you could change about open source and open standards communities, what would it be?

JB: So my M.O. is to try to make these spaces more human-interoperable. With an open source project or open standards project, we’re talking about some kind of technical interoperability problem that we want to solve. But it’s not usually the technical issues that cause delays or serious problems; nine times out of ten, it comes down to some human interoperability problem. Maybe it’s language differences, cultural differences, or mismatched expectations, or it’s process-oriented; there’s some other thing that may cause that activity to fail to launch.

So if there were something that I could do to change communities, I would love to make sure that everybody has resources for running great and effective meetings. One big problem with some of these activities is that their meetings could be run more effectively and more humanely. I would want humane meetings for everyone.

JP: Humane meetings for everyone! I’m pretty sure you could be elected to public office on that platform. <laughs>. What else do you like to do with your spare time, if you have any?

JB: I love to read; we’ve got a book club at OpenJS that we’re doing, and that’s fun. So, in my spare time, I like to take time to read or do a crossword puzzle or something on paper! I’m so sorry, but I still prefer paper books, paper magazines, and paper newspapers.

JP: Somebody just told me recently that they liked the smell of paper when reading a real book.

JB: I think they’re right; I think it feels better. It has a distinctive smell, but there’s also something very therapeutic and analog about it, because I like to disconnect from my digital devices. So, you know, doing something soothing like that. I also enjoy painting outdoors and going outside, spending time with my four-year-old, and that kind of thing.

JP: I think we all need to disconnect from the tech sometimes. Jory, thanks for the talk; it’s been great having you here.

The post Interview with Jory Burson, Community Director, OpenJS Foundation on Open Source Standards appeared first on Linux Foundation.
