Companies that have established open-source program offices over the years now need more C-suite oversight to drive education, awareness, and use of open-source software. These program offices set the stage for an expanded role for open-source officers.

Incorporating open-source technology gives organizations an ecosystem that expands their user base, building loyalty and stickiness. It also brings with it the need for greater executive oversight of open-source initiatives. Staying on top of open-source security best practices is critically important, as is disclosing and patching vulnerabilities.

Javier Perez, Perforce’s chief open-source evangelist, sees a trend toward greater open-source adoption in 2023. More organizations will realize that open-source software is critical to their operations, he predicted, and will move from consumers to participants, with increased use and adoption for business-critical infrastructure.

All software now contains open-source components, but some companies don’t even know how much open source they use, he said.

Businesses are no longer just consumers of open source. They are becoming active contributors, promoting open source and educating their engineering teams. Whether they use fully open-source products or commercial products with embedded open-source code, organizations therefore need to pay more attention to their software license management.
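
A basic license inventory does not require heavy tooling. Below is a minimal sketch of the idea for a Python environment; dedicated SCA tools such as pip-licenses or commercial scanners go much further, and the flagged-license list here is only an illustrative assumption:

```python
from importlib.metadata import distributions

def license_of(dist) -> str:
    """Prefer the explicit License field; fall back to trove classifiers."""
    meta = dist.metadata
    lic = meta["License"]
    if lic and lic != "UNKNOWN":
        return lic
    for cls in meta.get_all("Classifier") or []:
        if cls.startswith("License ::"):
            return cls.split("::")[-1].strip()
    return "UNKNOWN"

# Illustrative assumption: copyleft families a compliance team may want to review.
REVIEW = ("GPL", "AGPL")

for dist in sorted(distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
    name = dist.metadata["Name"] or "?"
    lic = license_of(dist)
    flag = "  <-- review" if any(f in lic.upper() for f in REVIEW) else ""
    print(f"{name}: {lic}{flag}")
```

An inventory like this is only a starting point; real license management also covers transitive dependencies, code copied into the tree, and obligations beyond the license name.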

Doing this requires a focus on the duties of open-source program offices. According to Perez, half of the organizations at Perforce’s Software Summit have such executives.

“It’s becoming more popular and strategic, talking about the strategy of which open-source projects companies are going to invest in,” Perez told LinuxInsider.

Threat Zones and the Role of the CISO

Despite the growing use of open source across all industries, ongoing malicious software supply chain attacks will slow open-source adoption this year, warned Paul Speciale, chief marketing officer at data management firm Scality.

Malware and ransomware attacks have increased so much that organizations worldwide are now infiltrated every few minutes, costing businesses millions of dollars in losses per incident and consuming untold IT cycles. We have also seen security compromises in commercial software solutions, as recent high-profile attacks show, he explained.

“Open-source software dependency will become a growing threat vector, requiring enterprises to more carefully evaluate and test these technologies before deploying them on a large scale,” Speciale told LinuxInsider.


Eric Cole, a consultant at data security firm Theon Technology and a former CIA professional hacker, suggested the focus this year will be on regulating software, not unlike actions already taken by European governments.

“We will see a major shift in the CISO (chief information security officer) role, including increased hiring and firing of CISO positions,” Cole told LinuxInsider.

He predicted that the position will pivot toward hiring more business-oriented individuals who can communicate with the board, rather than the technical specialists who currently fill the role.

Modular, Integrated Offerings a Major OSS Driver

A continued move toward modular software solutions will drive new adoption of open-source solutions in 2023, according to Moses Gutman, CEO and co-founder of machine learning operations platform developer ClearML.

MLOps teams should consider open-source infrastructure instead of being locked into long-term contracts with cloud providers. While organizations doing machine learning at hyperscale can undoubtedly benefit from integration with their cloud providers, it forces these companies to work the way the provider wants them to work, he offered.

“Open source provides flexible customization, cost savings, and efficiency. You can even modify open-source code to make sure it works exactly the way you want it to. With this, it’s becoming a more viable option,” Gutman told LinuxInsider.

One of the factors slowing MLOps adoption is the overabundance of point solutions. That doesn’t mean they don’t work, he offered. But they may not integrate well together, leaving gaps in the workflow.

Gutman said, “Because of that, I firmly believe that 2023 is the year the industry moves toward a unified, end-to-end platform built from modules that can be used individually and integrated seamlessly.”

Open-Source Adoption in 2023

This year, it will become clear that open source is not just the domain of large enterprises like IBM, Google, Red Hat and Microsoft. It is now a necessity for every industry, and for small companies as well.

“We see a lot of banks now contributing directly to open source as they specialize. So, we see adoption across all industries. Many companies are becoming more receptive to open source and more actively involved in maintaining and advancing the

Scalable cloud-based solutions are widely popular among IT professionals these days. The cost, convenience and reliability of ready-to-use software as a service make this disruptive technology a favorable choice.

Still, the market needs some reassurance that backing up to the cloud is a smart and secure thing to do, suggested Paul Evans, CEO of UK-headquartered data management provider Redstor.

Redstor has over 40,000 customers globally, more than 400 partners, and performs over 100 million restores a year. Last month in London, Redstor was named Hosted Cloud Vendor of the Year at the 2022 Technology Reseller Awards.

“Companies should not only say goodbye to on-premises boxes, they should celebrate because their removal reduces the risk of ransomware or the effects of fire or flooding in the data center,” Evans told TechNewsWorld.

SaaS is a software delivery model that provides great agility and cost-effectiveness for companies. This makes it a reliable choice for many business models and industries. It is also popular among businesses due to its simplicity, user accessibility, security and wide connectivity.

According to Evans, SaaS trends are disrupting the industry this year. Spiceworks Ziff Davis predicts that next year half of all workloads will be in the cloud.

Many organizations are undertaking cloud-first migration projects. Of particular interest are hard-hit businesses looking to acquire infrastructure through operational expenditure (OpEx) models and frameworks to avoid huge upfront investments.

“Data will become increasingly cloud-native in the coming year, especially with the continued growth of Kubernetes, Microsoft 365, Google Workspace and Salesforce,” he said.

Threat Landscape a Driving Factor

Grand View Research recently reported that the global managed services market, valued at US$239.71 billion in 2021, is expected to grow at a compound annual growth rate (CAGR) of 13.4 percent from this year to 2030. Many managed service providers (MSPs) are looking to become more services-driven.
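
As a quick sanity check of that projection (a sketch only; Grand View’s own model may compound from a different base year or base value):

```python
# Project the market size forward at a 13.4% CAGR from the 2021 base.
base_2021 = 239.71  # USD billions, per Grand View Research
cagr = 0.134

value = base_2021
for year in range(2022, 2031):  # compound through 2030
    value *= 1 + cagr
print(f"Implied 2030 market size: ${value:,.1f}B")
# Roughly $743B under these assumptions.
```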

At the same time, value-added resellers are looking to become cloud service providers, and other distributors are trying to figure out where they might best fit, Evans said.

“The backdrop of this is a threat landscape that has changed dramatically, especially after Russia’s invasion of Ukraine. State-sponsored malware and cyber warfare are coming to the fore alongside shrewd, renegade criminals,” he said.

US President Joe Biden has called for the private sector to step in and close its “digital doors” to protect critical infrastructure. Sir Jeremy Fleming, director of the UK’s intelligence, cyber and security agency GCHQ, warned that the Russian regime is identifying institutions and organizations to bring down, making it only a matter of time before the attacks come.

“Threats are not only increasing in scale and complexity. The range of ransomware attacks makes it abundantly clear that companies of all shapes and sizes will increasingly become targets. As a result, we will see more businesses enlisting MSPs to run their IT, cybersecurity and compliance programs,” predicted Evans.

During our conversation, Evans and I discussed further how Redstor and other providers can strengthen digital security.

TechNewsWorld: What’s unique about Redstor technology compared to other solutions for data management and disaster recovery?

Paul Evans: Our approach focuses on the concerns businesses have regarding their risk posture, resource constraints and profitability challenges at a time when IT skills are scarce. Redstor offers what we believe is the smartest and simplest backup platform for MSPs.

One factor is the ease associated with onboarding. With three clicks and a password, users are up and running and can scale easily. In addition, the platform is lightweight, supports multiple data connectors, and is purpose-built from the ground up for MSPs that manage multiple accounts.

It’s not a Frankenstein’s monster of hastily acquired solutions bolted together.

What makes Redstor’s platform technically smart?

Evans: Whether MSPs are protecting data on-premises or in the cloud – Microsoft 365, Google Workspace, or cloud-native Kubernetes – they can do it easily, all with one app. By being able to span the on-premises, cloud, and SaaS worlds from a single location, rather than moving between several different interfaces, MSPs save time and money.

Redstor is smart because we enable user-driven recovery by streaming backup data on demand, so organizations have everything they need to get straight back up and running in the event of data loss.

You don’t need to mirror everything, copy everything, or recover everything before it starts working again. During an outage, InstantData technology restores critical data in seconds, while less critical recovery continues in the background.
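
InstantData itself is proprietary, but the general pattern Evans describes (restore by priority, stream the rest lazily) can be sketched roughly as follows; all paths and priorities here are hypothetical:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class RestoreItem:
    priority: int                       # 0 = critical, higher = less urgent
    path: str = field(compare=False)
    size_mb: int = field(compare=False)

def stream_restore(items):
    """Restore in priority order: critical files first, bulk data afterward."""
    queue = list(items)
    heapq.heapify(queue)
    while queue:
        item = heapq.heappop(queue)
        # A real system would stream bytes from backup storage on demand here.
        print(f"restoring {item.path} ({item.size_mb} MB, priority {item.priority})")

stream_restore([
    RestoreItem(2, "/archive/2019-logs.tar", 50_000),
    RestoreItem(0, "/db/orders.sqlite", 800),   # needed to resume operations
    RestoreItem(1, "/home/shared/docs", 12_000),
])
```

The design point is that recovery time for the business is governed by the critical tier alone, not by the total backup size.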

This platform is also smart because it offers more than just backup. You also get archive and disaster recovery with high-end search and insights – all from one app.

Redstor’s platform is infused with AI: our machine learning model automatically detects and isolates suspicious files in backups so that they can be removed for malware-free recovery. MSPs can do data classification with tagging. In the future, we will introduce anomaly detection.
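
Redstor’s model is not public, but a common family of signals for spotting ransomware-touched files in a backup set is file-extension and byte-entropy heuristics. A toy sketch of that idea follows; the thresholds, extension list, and snapshot directory are illustrative assumptions, not Redstor’s logic:

```python
import math
from collections import Counter
from pathlib import Path

SUSPICIOUS_EXTENSIONS = {".locked", ".encrypted", ".crypt"}  # illustrative list

def shannon_entropy(data: bytes) -> float:
    """Bits per byte; fully random (encrypted) data approaches 8.0."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def is_suspicious(path: Path, entropy_threshold: float = 7.5) -> bool:
    if path.suffix.lower() in SUSPICIOUS_EXTENSIONS:
        return True
    with path.open("rb") as fh:
        sample = fh.read(65536)  # sample the first 64 KB
    return shannon_entropy(sample) > entropy_threshold

for f in Path("backup_snapshot").rglob("*"):  # hypothetical snapshot directory
    if f.is_file() and is_suspicious(f):
        print(f"quarantine candidate: {f}")
```

Entropy alone misfires on compressed archives, which is why production systems combine many signals and, as here, only flag candidates for quarantine rather than deleting outright.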

How do cloud-based SaaS data protection and recovery systems compare to other solutions?

Evans: Organizations found that they needed multiple boxes onsite to pull data down quickly and get a faster experience than the cloud alone. But on-premises Frankenstein solutions, cobbled together from technology gained in multiple acquisitions, are not going to meet today’s challenges.

Redstor CEO Paul Evans

Also, with hardware there can be supply-chain issues and shortages of critical components such as semiconductors. Moving your data security to the cloud eliminates both of these issues, and the responsibility rests entirely with the MSP.

Without cloud-based security, you lack the best means of securing data. SaaS security is constantly updated and built in. Free updates are provided on a regular release cycle to keep customers ahead of the risks. MSPs get reliable and secure connectors for many sources and popular applications, now and in the future.

Also, storing backups securely in geographically separated data centers creates an air gap between live data and backups to enhance security.

What is driving the popularity of SaaS data protection?

Evans: The biggest driver came when being onsite became problematic during the pandemic. Those with hardware-dependent data security faced challenges fixing and swapping out boxes. Many organizations also do not want boxes onsite because they are hard to come by due to supply-chain issues. Furthermore, the devices are known to be ransomware magnets.

SaaS overcomes these issues and more. MSPs are open to data portability requests and enable tools and services designed for today’s challenges. They can also deliver services digitally, and distributors appreciate the value of channel-ready SaaS supplied through online marketplaces.

Most SaaS applications now stress the need for a separate backup. More people are realizing that just because you have Microsoft doesn’t mean your data can’t be compromised. You may have an internal user who destroys data, or you may not have enough retention. Backing up SaaS applications is now the fastest-growing part of our business.

What should an MSP look for from a vendor besides good technical support?

Evans: Technology built for MSPs should be partner-friendly from the start and include deep sales and marketing support. It should offer attractive margins with clear, transparent pricing so that MSPs can easily sell services.

The software should rapidly enhance data security, and by the end of the first negotiation, MSPs should be able to offer a proof of concept by deploying backups and performing rapid recovery to close deals faster.

Vendors should provide MSPs with the ability to purchase whatever they need from a single source, whether it’s protection for a Kubernetes environment, malware detection for backups, or data classification.

Also key is a single interface that eliminates the complexity of switching between different solutions and consoles. Being able to view and manage data from one place saves valuable time.

A vendor’s platform should be designed for multi-tenancy and provide a high-level view of the MSP’s own usage and customer consumption, as well as the types of data protected and where that data resides. The vendor should also have a history of applying new advances, particularly AI, to malware detection and removal, data classification and cyberattack prediction.

How should businesses assess vendor suitability?

Evans: Many vendors make bold claims to be the best solution to the challenges in the market. MSPs should get direct feedback from their peers and adequately field-test the solutions.

Check the rankings on G2’s Top 20 Backup Software and Top 20 Online Backup Software lists, and other user-supported reviews. Focus on reports based on user satisfaction and review data. For example, Redstor ranks first with G2.

Also look for vendors that provide a clear road map of future growth that the MSP should be able to influence. Lastly, MSPs should focus on smart solutions that provide simplified security.

Do you know whether your company data is clean and well managed? Why does it matter anyway?

Without a working governance plan, you may have no company to worry about – data-wise.

Data governance is a collection of practices and processes establishing the rules, policies and procedures that ensure data accuracy, quality, reliability and security. It ensures the formal management of data assets within an organization.

Everyone in business understands the need to have and use clean data. But making sure it’s clean and usable is a bigger challenge, according to David Kolinek, vice president of product management at Ataccama.

This challenge is compounded when business users have to rely on scarce technical resources. Often, no one person oversees data governance, or that person doesn’t have a complete understanding of how the data will be used and how to clean it up.

This is where Ataccama comes into play. The company’s mission is to provide a solution that even people without technical knowledge, such as SQL skills, can use to find the data they need, evaluate its quality, understand how to fix any issues, and determine whether that data will serve their purposes.

“With Ataccama, business users don’t need to involve IT to manage, access and clean their data,” Kolinek told TechNewsWorld.

Keeping Users in Mind

Ataccama was founded in 2007 and was originally bootstrapped.

It started as part of a consulting company, Adastra, which is still in business today. However, Ataccama focused on software rather than consulting, so management spun off that operation as a product company addressing data quality issues.

Ataccama started with a basic approach – an engine that did basic data cleaning and transformation. But it still required an expert user, because the configuration was user-supplied.

“So, we added a visual presentation of the steps, enabling things like data transformation and cleanup. This made it a low-code platform, because users were able to do most of the work using just the application user interface. But at that point it was still a fat-client platform,” Kolinek explained.

However, the current version is designed with the non-technical user in mind. The software includes a thin client, a focus on automation, and an easy-to-use interface.

“But what really stands out is the user experience, made up of the seamless integration that we were able to achieve with the 13th version of our engine. It delivers robust performance that is crafted to perfection,” he offered.

Digging deeper into data management issues

I asked Kolinek to discuss the issues of data governance and quality further. Here is our conversation.

TechNewsWorld: How is Ataccama’s concept of centralizing or consolidating data management different from other cloud systems such as Microsoft, Salesforce, AWS and Google Cloud?

David Kolinek: We are platform-agnostic and do not target a specific technology. Microsoft and AWS have their own native solutions that work well, but only within their own infrastructure. Our portfolio is wide open, so it can serve use cases in any infrastructure.

In addition, we have data processing capabilities that not all cloud providers have. Metadata is useful for automated processing and for generating more metadata, which in turn can be used for additional analysis.

We have developed both these technologies in-house so that we can provide native integration. As a result, we can provide a better user experience and complete automation.

How is this concept different from the notion of standardization of data?

David Kolinek, Vice President of Product Management, Ataccama

Kolinek: Standardization is just one of many things we do. Typically, standardization can be easily automated, in the same way that we can automate cleaning or data enrichment. We also provide manual data correction when resolving certain issues, such as missing Social Security numbers.

We cannot generate an SSN, but we can derive a date of birth from other information. So, standardization is no different; it is a subset of the things that improve quality. For us, though, it is not just about data standardization. It is about having good-quality data so that the information can be leveraged properly.
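
As a rough illustration of rule-based standardization plus enrichment (not Ataccama’s implementation; the field names, phone rule, and national ID format below are hypothetical):

```python
import re
from datetime import date, datetime

def standardize_phone(raw: str) -> str:
    """Normalize US-style phone numbers to a +1 canonical form (illustrative rule)."""
    digits = re.sub(r"\D", "", raw)
    return f"+1{digits[-10:]}" if len(digits) >= 10 else "INVALID"

def derive_birth_date(record: dict) -> date | None:
    """Enrichment: fill a missing birth date from another field when possible."""
    if record.get("birth_date"):
        return datetime.strptime(record["birth_date"], "%Y-%m-%d").date()
    # Hypothetical: a national ID whose first six digits encode YYMMDD.
    nid = record.get("national_id", "")
    if re.fullmatch(r"\d{6}/?\d{3,4}", nid):
        yy, mm, dd = int(nid[:2]), int(nid[2:4]), int(nid[4:6])
        century = 1900 if yy > 25 else 2000  # naive century pivot
        return date(century + yy, mm, dd)
    return None  # cannot generate what isn't there (e.g., a missing SSN)

record = {"phone": "(555) 867-5309", "birth_date": "", "national_id": "840512/1234"}
print(standardize_phone(record["phone"]))   # +15558675309
print(derive_birth_date(record))            # 1984-05-12
```

The point of the example is the asymmetry Kolinek describes: rules can reshape or derive values that are recoverable from other fields, but they cannot invent data that simply is not there.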

How does Atacama’s data management platform benefit users?

Kolinek: User experience is really our biggest advantage, and the platform is ideal for serving multiple personas. Companies need to enable both business users and IT people when it comes to data management, and that requires a solution that lets business and IT collaborate.

Another great advantage of our platform is the strong synergy between data processing and metadata management that it provides.

Most other data management vendors cover only one of these areas. We also combine machine learning with a rules-based approach to validation and standardization, which, again, other vendors do not support.

Furthermore, because we are technology-agnostic, users can connect to many different technologies from a single platform. With edge processing, for example, you can configure something once in Ataccama ONE, and the platform will translate it for different platforms.

Does Ataccama’s platform lock in users the way proprietary software often does?

Kolinek: We have developed all the main components of the platform ourselves, and they are tightly integrated. There has been a huge wave of acquisitions in this space lately, with big vendors buying smaller ones to fill in the gaps. In some cases, you are actually buying and managing not one platform but several.

With Ataccama, you can buy just one module, such as data quality/standardization, and later expand to others, such as master data management (MDM). It all works together seamlessly; just activate our modules as you need them. This makes it easy for customers to start small and expand when the time is right.

Why is an integrated data platform so important in this process?

Kolinek: The biggest advantage of a unified platform is that companies are not left looking for point solutions to single problems like data standardization. It is all interconnected.

For example, to standardize you must verify the quality of the data, and for that, you must first find and catalog it. If you have an issue, even though it may seem like a discrete problem, it probably involves many other aspects of data management.

The beauty of an integrated platform is that in most use cases, you have a solution with native integration, and you can start using other modules.

What role do AI and ML play today in data governance, data quality and master data management? How is this changing the process?

Kolinek: Machine learning enables customers to be more proactive. Previously, you would first identify and report a problem, then check what went wrong and see whether something was wrong with the data, and finally create a data quality rule to prevent a recurrence. It was all reactive, based on something being broken, found, reported and fixed.

ML, by contrast, lets you be proactive. You give it training data instead of rules. The platform then detects differences in patterns and identifies discrepancies, helping you realize there is a problem before anyone reports it. This is not possible with a rule-based approach, and it is very easy to scale when you have a large number of data sources. The more data you have, the better the training and its accuracy.
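
To make the contrast concrete, here is a minimal sketch of rule-based versus learned anomaly detection on a numeric column. It is not Ataccama’s implementation; scikit-learn’s IsolationForest and the synthetic data are assumptions chosen for illustration:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Historical "clean" order amounts used as training data.
train = rng.normal(loc=100, scale=15, size=(1000, 1))
# New batch containing a few corrupted records.
new = np.array([[98.0], [104.0], [-5.0], [12000.0], [101.0]])

# Rule-based: a hand-written threshold someone had to write after an incident.
rule_flags = (new < 0) | (new > 1000)

# ML-based: the model learns the normal pattern from training data; no rule needed.
model = IsolationForest(contamination=0.01, random_state=42).fit(train)
ml_flags = model.predict(new) == -1  # -1 marks outliers

for value, r, m in zip(new.ravel(), rule_flags.ravel(), ml_flags):
    print(f"{value:>10.1f}  rule:{bool(r)}  ml:{bool(m)}")
```

Here both approaches catch the corrupted values, but the learned model required no one to anticipate the failure mode in advance, which is the proactivity Kolinek describes.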

Aside from cost savings, what benefits can enterprises gain from consolidating their data repositories? For example, does it improve security, CX results, etc.?

Kolinek: Consolidation improves security and minimizes potential future leaks. For example, we had customers who were storing data that no one was using. In many cases, they didn’t even know the data existed! Now, they are not only consolidating their technology stack, but they can also see all the stored data.

It is also very easy to add newcomers to the platform with consolidated data. The more transparent the environment, the sooner people will be able to use it and start getting value.

It is not so much about saving money as it is about leveraging all your data to generate a competitive advantage and generate additional revenue. It provides data scientists with the means to build things that will drive business forward.

What are the steps in adopting a data management platform?

Kolinek: Start with a preliminary analysis. Focus on the biggest issues the company wants to tackle and select platform modules to address them. It is important to define goals at this stage. Which KPIs do you want to target? What level of data quality do you want to achieve? These are questions you should ask.

Next, you need a champion to drive execution and identify the key stakeholders behind the initiative. This requires extensive communication among those stakeholders, so it is important that someone focuses on educating others about the benefits and helping teams get onto the system. Then comes the implementation phase, where you address the key issues identified in the analysis, followed by the rollout.

Finally, think about the next set of issues that need to be addressed and, if necessary, enable additional modules in the platform to achieve those goals. The worst thing is to buy a tool and deploy it without providing any service, education or support; that guarantees a low adoption rate. Education, support and service are critical to the adoption phase.