Is your favorite Linux desktop Cinnamon, MATE or Xfce? Or are you longing for something different and potentially better?

Then one of your best options is upgrading to Linux Mint 21 “Vanessa,” released on July 30. It comes in a choice of Ubuntu- or Debian-based flavors.

Making this recommendation is a big step for me. Linux Mint was once my daily Linux driver, but I had a major falling out with the distribution several years ago when an upgrade caused some troubling issues, which drew unpleasant reactions — and no solution at all — from the Linux Mint tech support community.

I then jumped to Feren OS, a near-clone of Linux Mint, and was a happy user until the distro’s developers made a radical design change and moved away from the traditional Cinnamon desktop.

So I jumped distros again. I had reviewed the then-new Cinnamon Remix distro released by an independent Linux developer. My go-to Linux distro became Ubuntu Cinnamon Remix, later renamed Cinabuntu. I’ve been very happy with its performance and usability options since then.

The ability to pick and choose operating system and configuration options is one of the shining advantages of Linux. With Windows or macOS, you cannot quickly install a replacement OS with the same look and feel.

However, with the release of Linux Mint 21, my inner Linux reviewer got the best of me. I was curious about what I was missing.

I discovered some features that are not available in my current Cinnamon setup. New features land in the MATE and Xfce versions as well. The LM 21 editions include the latest versions of the three supported desktop environments: Cinnamon 5.4, Xfce 4.16, and MATE 1.26.

Read on to see what’s pulling me back to Linux Mint. Since Cinnamon is my favorite desktop, I focused on that version for this review.

hello old friend

The Vanessa release rekindled my appreciation for how tightly knit Linux Mint is as a computing platform. From the initial loading of the live session DVD to the impeccable installation, I was up and running in less than 30 minutes.

The welcome screen is becoming a standard setup routine for Linux installations, and other distros could take lessons from Linux Mint on how to do it correctly. Even for experienced Linux users, Linux Mint’s approach makes it fast and convenient to perform all the first-run tasks.

The left column of the panel provides general information, documentation, and a helpful index for completing the first steps. This is especially useful for new users who are unfamiliar with Linux in general – and LM in particular.

The main window area walks you through each step of updating system components and basic desktop configuration. Each section briefly describes what is included. The green-themed launch button sets each part of the process in motion.

Steps include desktop color selection, choosing a traditional or modern panel layout, updating drivers and system components, adjusting system settings, and opening the software manager. The process also includes activating the built-in firewall, an item many users overlook.

Linux Mint 21 Welcome Screen

The Linux Mint 21 welcome screen guides you through all the setup steps after installation, and it’s also a handy reminder that updates need to be made from time to time.


desktop difference

Design and usability features are one of the reasons I favor the Cinnamon desktop. It has one of the most detailed and organized configuration panels of any Linux distribution.

The System Settings panel keeps all the configuration options in one place. Unlike desktop layouts that offer very few options, Linux Mint organizes all its system controls into four general categories. In total, 40 icons conceal related subcategories until you click an icon to open it.

KDE Plasma is the only other desktop with a comparable number of configuration options. But its design is a series of separate settings panels that scatter controls and user options across many menu locations.

While the configuration options available in the MATE and Xfce versions are less extensive, they still offer the ability to customize the look and feel to suit your computing needs.

Linux Mint does a better job than other desktops in how it handles screen design and usability. It has a wide range of quick-access tools called desklets that reside on the desktop screen, and its use of applets on the lower panel adds flexibility.

LM also provides a collection of extensions that add even more usability options (similar to those available in the KDE Plasma desktop). This combination of features is a solid reason to try this distro.

Linux Mint 21 Desktop Configuration Options

The desktop configuration options available in the MATE and Xfce versions are less extensive than in Cinnamon, but they still let you customize the look and feel to suit your computing needs.


under the hood

Linux Mint 21 is based on Ubuntu 22.04 and provides a full WIMP (windows, icons, menus, pointer) interface. It is a Long Term Support (LTS) release, supported until 2027.

Vanessa, which continues LM’s tradition of naming every release with a female name ending in the letter “A,” is packed with notable improvements in performance, compatibility, and stability. It ships with Linux kernel 5.15 LTS.

Other changes include a new NTFS file system driver that simplifies interaction with Windows partitions, improvements to the default EXT4 file system, as well as improved hardware support, security patches and bug fixes.

A major Bluetooth change replaces the Blueberry app, which relied on GNOME desktop plumbing, with Blueman. Unlike Blueberry, Blueman is desktop-agnostic and integrates well across all environments. It depends on the standard BlueZ stack and works universally, including from the command line.

Blueman’s manager and tray icons offer features that were not available in Blueberry. It surfaces more information for monitoring connections and troubleshooting Bluetooth issues, and it brings better support for headsets and audio profiles.

Linux Mint 21 Cinnamon Desktop

Linux Mint 21’s classic Cinnamon desktop design sports a favorites column, an application category list, and a changing sublist of installed titles.


pain point solution

Thumbnail support for more file types is new to Vanessa; its absence in earlier releases was a usability issue. To address this, a new XApp (Linux Mint-exclusive application) project called xapp-thumbnailers was developed for Linux Mint 21.

Process Monitor is a pain-point solution for me. It places a special icon in the system tray when automated tasks are running in the background. Such tasks can slow system performance until they complete. This new monitor is a silent alert that explains computer slowdowns.

Timeshift was an independent project for backing up and restoring operating systems until its developer abandoned it. LM took over maintenance of Timeshift prior to the release of LM 21. Timeshift is now an XApp.

One immediate benefit is a change in the way rsync mode works. It now calculates the space required for the next OS snapshot. If there is less than 1 GB of free space on the disk when the snapshot is executed, it aborts instead of proceeding.
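That pre-flight check is simple to picture. The sketch below is a hypothetical stand-in for Timeshift’s actual logic: measure free space on the snapshot target and abort when it falls under a 1 GB threshold.

```python
import shutil

GIB = 1024 ** 3  # the 1 GB threshold described for rsync mode

def snapshot_preflight(target_dir: str, min_free: int = GIB) -> bool:
    """Return True if a snapshot may proceed, False if it should abort.

    Illustrative sketch only -- not Timeshift's actual code.
    """
    free = shutil.disk_usage(target_dir).free
    if free < min_free:
        print(f"abort: under {min_free} bytes free on {target_dir}")
        return False
    print(f"proceed: {free} bytes free on {target_dir}")
    return True
```

Running `snapshot_preflight("/timeshift")` before each snapshot is all the feature amounts to conceptually: a cheap check that prevents a backup from filling the disk.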

Another pain-point remedy is how LM 21 now handles package removal. Removal from the main menu (right-click, Uninstall) is prevented if an evaluation shows that other programs would be affected; an error message appears and the operation stops.

If no damage to major system components is found, uninstalling an application from the main menu also removes dependencies that were installed automatically and are no longer needed.
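The evaluation can be modeled as a reverse-dependency check followed by an orphan sweep. This is a hypothetical sketch of that logic, not Mint’s actual implementation (which sits on top of apt); all package names are invented for illustration.

```python
def plan_removal(pkg, installed, deps, auto_installed):
    """Refuse removal if other installed packages need `pkg`; otherwise
    return `pkg` plus any auto-installed dependencies left orphaned.

    `deps` maps each package to the set of packages it depends on.
    Illustrative sketch -- not Linux Mint's actual code.
    """
    dependents = {p for p in installed
                  if p != pkg and pkg in deps.get(p, set())}
    if dependents:
        # LM 21 shows an error and stops the operation here
        raise RuntimeError(f"cannot remove {pkg}: required by {sorted(dependents)}")
    remaining = installed - {pkg}
    orphans = {d for d in deps.get(pkg, set())
               if d in auto_installed
               and not any(d in deps.get(p, set()) for p in remaining)}
    return {pkg} | orphans
```

Removing an app whose library nothing else uses returns both packages for removal; trying to remove a library another program still needs raises instead of proceeding.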

Linux Mint 21 Scale and Expo Window View

Scale and Expo window views in Cinnamon are triggered by hot corners and applets on the lower panel.


ground level

The computer hardware requirements for Linux Mint 21 have not changed. You need a reasonably modern computer because LM is not as light on system resources as it used to be: a box with a 64-bit processor, at least 2GB of RAM, and 15GB of free disk space.
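Those minimums are easy to verify before installing. Here is a rough self-check using only the Python standard library on Linux; the thresholds mirror the stated requirements, and this is an illustrative sketch rather than an official Mint tool.

```python
import os
import platform
import shutil

GIB = 1024 ** 3

def meets_mint21_minimums(install_path: str = "/") -> dict:
    """Rough check against LM 21's stated minimums:
    64-bit CPU, 2 GB RAM, 15 GB free disk space. Sketch only."""
    # Total physical RAM in bytes (Linux-specific sysconf names)
    ram = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    return {
        "cpu_64bit": platform.machine() in ("x86_64", "amd64", "aarch64"),
        "ram_2gb": ram >= 2 * GIB,
        "disk_15gb_free": shutil.disk_usage(install_path).free >= 15 * GIB,
    }
```

If any value comes back `False`, the machine falls short of the stated minimums for a comfortable install.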

If you need help installing Linux Mint 21, the Linux Mint website has a comprehensive installation guide. But you should not need it. The installation engine is well polished. Most of my computers run multiple partitions, which usually forces manual intervention.

The LM 21 installer did not stumble. It simply asked where to put the OS and handled all the partitioning and adjustments in the background.


suggest a review

Is there a Linux software application or distro that you would like to recommend for review? Something you love or want to know?

Email me your thoughts and I’ll consider them for future columns.

And use the Reader Comments feature below to provide your input!

Scalable cloud-based solutions are widely popular among IT professionals these days. The cost, convenience and reliability of ready-to-use software as a service make this disruptive technology a favorable choice.

Still, the market needs some reassurance that backing up to the cloud is a smart and secure thing to do, as suggested by Paul Evans, CEO of UK-headquartered data management provider Redstor.

Redstor has over 40,000 customers globally, more than 400 partners, and over 100 million restores a year. Last month in London, Redstor was named Hosted Cloud Vendor of the Year at the 2022 Technology Reseller Awards.

“Companies should not only say goodbye to on-premises boxes, they should celebrate because their removal reduces the risk of ransomware or the effects of fire or flooding in the data center,” Evans told TechNewsWorld.

SaaS is a software delivery model that provides great agility and cost-effectiveness for companies. This makes it a reliable choice for many business models and industries. It is also popular among businesses due to its simplicity, user accessibility, security and wide connectivity.

According to Evans, several SaaS trends are disrupting the industry this year. Spiceworks Ziff Davis predicts that next year half of all workloads will be in the cloud.

Many organizations are undertaking cloud-first migration projects. Of particular interest are hard-hit businesses looking to shift infrastructure to operational expenditure (OpEx) models and frameworks to avoid huge upfront investments.

“Data will become increasingly cloud-native in the coming year, especially with the continued growth of Kubernetes, Microsoft 365, Google Workspace and Salesforce,” he said.

Threat Landscape a Driving Factor

Grand View Research recently reported that the global managed services market, which was valued at US$ 239.71 billion in 2021, is expected to grow at a compound annual growth rate (CAGR) of 13.4 percent from this year to 2030. Many Managed Service Providers (MSPs) are looking to become more service driven.

At the same time, value-added resellers are looking to become cloud service providers. Evans said other distributors are trying to figure out which way they might be the best fit.

“The backdrop of this is a threat landscape that has changed dramatically, especially after Russia’s invasion of Ukraine. State-sponsored malware and cyber warfare are coming to the fore alongside renegade, shrewd criminals,” he said.

US President Joe Biden has called for the private sector to step in and close its “digital doors” to protect critical infrastructure. Sir Jeremy Fleming, director of the UK’s intelligence, cyber and security agency GCHQ, warned that the Russian regime is identifying institutions and organizations to bring down, making it only a matter of time before the attacks come.

“Threats are not only increasing in scale and complexity. The range of ransomware attacks makes it abundantly clear that companies of all shapes and sizes will increasingly become targets. As a result, we will see more businesses enlisting MSPs to run their IT, cybersecurity, and compliance programs,” predicted Evans.

During our conversation, I discussed with Evans how Redstor and other providers can strengthen digital security.

TechNewsWorld: What’s unique about Redstor technology compared to other solutions for data management and disaster recovery?

Paul Evans: Our approach focuses on businesses’ concerns about their risk position, resource constraints, and profitability challenges at a time when IT skills are scarce. Redstor offers what we believe is the smartest and simplest backup platform for MSPs.

One factor is the ease of onboarding. With three clicks and a password, users are up and running and can scale easily. In addition, the platform offers lightweight support for multiple data connectors and is purpose-built from the ground up for MSPs that manage multiple accounts.

It’s not a Frankenstein’s monster of hastily acquired solutions bolted together.

What makes Redstor’s platform technically smart?

Evans: Whether MSPs are protecting data on-premises or in the cloud – Microsoft 365, Google Workspace, or cloud-native Kubernetes – they can do it easily, all with one app. By spanning the on-premises, cloud, and SaaS worlds from a single location, rather than moving among several different interfaces, MSPs save time and money.

Redstor is smart because we enable user-driven recovery by streaming backup data on demand, so organizations have everything they need to get straight up and running in the event of data loss.

You don’t need to mirror everything, copy everything, or recover everything before it starts working again. During an outage, InstantData technology restores critical data back in seconds, while less critical recovery continues in the background.

This platform is also smart because it offers more than just backup. You also get archive and disaster recovery with high-end search and insights – all from one app.

Redstor leverages AI: our machine-learning model automatically detects and isolates suspicious files in backups so they can be removed for malware-free recovery. MSPs can do data classification with tagging. In the future, we will introduce anomaly detection.

How do cloud-based SaaS data protection and recovery systems compare to other solutions?

Evans: Organizations find that they need multiple boxes onsite to pull data down quickly for a faster, cloud-like experience. But on-premises Frankenstein solutions, cobbled together from technology acquired in multiple acquisitions, aren’t going to meet today’s challenges.

Redstor CEO Paul Evans

Also, with hardware there can be supply-chain issues and shortages of critical components such as semiconductors. Moving your data security to the cloud eliminates both issues, and the responsibility rests entirely with the MSP.

Without cloud-based security, you lack the best means of securing data. SaaS security is built in and constantly updated: free updates are provided on a regular release cycle to keep customers ahead of the risks. MSPs get reliable and secure connectors for many sources and popular applications, now and in the future.

Also, storing backups securely in geographically separated data centers creates an air gap between live data and backups to enhance security.

What is driving the popularity of SaaS data protection?

Evans: The biggest driver emerged when being onsite became problematic during the pandemic. Those with hardware-bound data security faced challenges fixing and swapping out boxes. Many organizations also do not want boxes onsite because they are hard to come by due to supply-chain issues. Furthermore, the devices are known to be ransomware magnets.

SaaS overcomes these issues and more. MSPs are open to data portability requests and enable tools and services designed for today’s challenges. They can also deliver the services digitally, and distributors appreciate the value of channel-ready SaaS supplied through online marketplaces.

Most SaaS applications now stress the need for a separate backup. More people are realizing that just because you have Microsoft doesn’t mean you can’t be compromised. You may have an internal user that destroys the data, or you may not have enough retention. Backing up SaaS applications is now the fastest growing part of our business.

What should an MSP look for from a vendor besides good technical support?

Evans: Technology built for MSPs should be partner-friendly from the start and include deep sales and marketing support. It should offer attractive margins with clear, transparent pricing so that MSPs can easily sell services.

The software should rapidly enhance data security, and by the end of the first engagement, MSPs should be able to offer a proof of concept by deploying backups and performing rapid recoveries to close deals faster.

Vendors should provide MSPs with the ability to purchase whatever they need from a single source, whether it’s protection for a Kubernetes environment, malware detection for backups, or data classification.

Key, too, is a single interface that eliminates the complexity of switching between different solutions and consoles. Plus, the ability to view and manage data from one place saves valuable time.

A vendor’s platform should be designed for multi-tenancy and provide a high-level view of the MSP’s own usage and customer consumption, along with the types of data protected and where they reside. The vendor should also have a history of applying new advances, particularly AI, to malware detection and removal, data classification, and cyberattack prediction.

How should businesses assess vendor suitability?

Evans: Many vendors boldly claim to be the best solution to the challenges in the market. MSPs should get direct feedback from their peers and adequately field-test the solutions.

Check the rankings for the G2 lists of Top 20 Backup Software and Top 20 Online Backup Software, as well as other user-supported reviews. Focus on reports based on user satisfaction and review data. For example, Redstor ranks first with G2.

Also look for vendors that provide a clear road map of future development that the MSP can influence. Lastly, MSPs should focus on smart solutions that provide simplified security.

Canonical is emphasizing security and usability in the management of Internet of Things (IoT) and edge devices with its June 15 release of Ubuntu Core 22, a fully containerized variant of Ubuntu 22.04 LTS optimized for IoT and edge devices.

In line with Canonical’s technology offerings, this release brings Ubuntu’s operating system and services to the full range of embedded and IoT devices. The new release includes a fully preemptible kernel to ensure time-bound responses. Canonical partners with silicon and hardware manufacturers to enable advanced real-time features on Ubuntu Certified Hardware.

“At Canonical, we aim to provide secure, reliable open-source access everywhere – from the development environment to the cloud, to the edge and across devices,” said Mark Shuttleworth, Canonical CEO. “With this release and Ubuntu’s real-time kernel, we are ready to extend the benefits of Ubuntu Core throughout the embedded world.”

One important thing about Ubuntu Core is that it is effectively Ubuntu, fully containerized. All applications, the kernel, and the operating system are strictly confined snaps.

This makes it ultra-reliable and perfect for unattended devices, with all unnecessary libraries and drivers removed, said David Beamonte, product manager for IoT and embedded products at Canonical.

“It uses the same kernel and libraries as Ubuntu and its flavors, and it’s something that developers love, because they can share the same development experience for every Ubuntu version,” he told LinuxInsider.

He said it has some out-of-the-box security features such as secure boot and full disk encryption to prevent firmware replacement, as well as firmware and data manipulation.

certified hardware key

Ubuntu’s certified hardware program is a key distinguishing factor in the industry’s response to the Core OS. It defines a range of trusted IoT and edge devices that are proven to work with Ubuntu.

The program typically includes a commitment to continuous testing of certified hardware in Canonical’s laboratories with every security update throughout the device’s lifecycle.

Advantech, which provides embedded, industrial, IoT, and automation solutions, has strengthened its participation in the Ubuntu Certified Hardware program, said Eric Cao, director of Advantech Wise-Edge+.

“Canonical ensures that certified hardware undergoes an extensive testing process and provides a stable, secure and optimized Ubuntu core to reduce market and development costs for our customers,” he said.

Brad Kehler, COO of KMC Controls, pointed to another use case: the security benefits the Core OS brings to the company’s range of IoT devices, which are purpose-built for mission-critical industrial environments.

“Security is of paramount importance to our customers. We chose Ubuntu Core for its built-in advanced security features and robust over-the-air update framework. Ubuntu Core comes with a 10-year security update commitment that allows us to keep devices safe in the field over their long lifetimes. With a proven application enablement framework, our development team can focus on building applications that solve business problems,” he said.

solving major challenges

IoT manufacturers face complex challenges in deploying devices on time and within budget. As device fleets expand, ensuring security and remote management becomes taxing. Ubuntu Core 22 helps manufacturers meet these challenges with an ultra-secure, resilient, and low-touch OS, backed by a growing ecosystem of silicon and original design manufacturer partners.

The first major challenge is enabling the OS on their hardware, be it custom or generic, Beamonte noted. It is hard work, and many organizations lack the skills to perform kernel porting tasks.

“Sometimes they have in-house expertise, but development can take a lot longer. This can affect both time and budget,” he explained.

IoT devices mostly run unattended and are usually deployed in places with limited or difficult access, he offered. It is therefore essential that they be extremely reliable. Sending a technician into the field to recover a bricked or unbootable device is costly, so reliability, low touch, and remote manageability are key factors in reducing OpEx.

That, he added, compounds the challenge of managing the devices’ software. A mission-critical, bulletproof update mechanism is essential.

“Manufacturers have to decide early in their development whether they are going to use their own infrastructure or third parties to manage the devices’ software,” Beamonte said.

Beyond Standard Ubuntu

Core 22’s containerization extends beyond the containerized features of non-Core Ubuntu OSes. In Ubuntu Desktop or Server, the kernel and operating system are .deb packages, and applications can run as .deb packages or snaps.

“In Ubuntu Core, all applications are strictly confined snaps,” Beamonte continued. “This means there is no way to access them from other applications except through well-defined and secure interfaces.”

Not only the applications are snaps; so are the kernel and the operating system. That, he said, makes managing the whole system’s software much easier.

“Although classic Ubuntu OSes can use snaps, strict confinement is not mandatory, so applications can have access to the full system, and the system can have access to applications.”

Strict confinement is mandatory in Ubuntu Core, where both the kernel and the operating system are strictly confined snaps. In addition, the classic Ubuntu versions are not optimized for size and do not include some Ubuntu Core features, such as secure boot, full disk encryption, and recovery mode.
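In snap packaging, the confinement level is declared in the snap’s snapcraft.yaml. A minimal fragment, with a hypothetical app name and the build `parts` omitted, shows where the setting lives:

```yaml
# Hypothetical snapcraft.yaml fragment -- names are illustrative,
# and the required `parts:` section is omitted for brevity.
name: sensor-agent
base: core22          # build against the Ubuntu Core 22 runtime
version: '1.0'
summary: Example strictly confined IoT app
grade: stable
confinement: strict   # mandatory on Ubuntu Core

apps:
  sensor-agent:
    command: bin/agent
    plugs: [network]  # access granted only via declared interfaces
```

With `confinement: strict`, the app can reach outside its sandbox only through the interfaces (plugs) it declares, which is the access model Beamonte describes.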

Other Essential Core 22 Features:

  • Real-time compute support via a real-time beta kernel provides high performance, ultra-low latency and workload predictability for time-sensitive industrial, telco, automotive and robotics use cases.
  • A dedicated IoT App Store is available for each device running Ubuntu Core. It provides complete control over apps and lets enterprises create, publish, and distribute software on a single platform. The IoT App Store offers a sophisticated software management solution, enabling a range of new on-premises features.
  • Transactional control for mission-critical over-the-air (OTA) updates of the kernel, OS, and applications. These updates always complete successfully or automatically revert to the previous working version, so a device cannot be “bricked” by an incomplete update. Snaps also provide delta updates to reduce network traffic, and digital signatures to ensure software integrity and provenance.
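The transactional guarantee amounts to: never discard the last working version until the new one is proven. Here is a toy model of that revert behavior; it is a hypothetical sketch, not snapd’s implementation, and `flaky_install` is an invented stand-in for an installer.

```python
def transactional_update(current: str, candidate: str, install) -> str:
    """Apply `install(candidate)`; on any failure, keep `current`.

    Toy model of transactional OTA updates -- not snapd's code.
    """
    try:
        install(candidate)
        return candidate      # update completed successfully
    except Exception:
        return current        # automatic revert: the device keeps booting

def flaky_install(version: str) -> None:
    """Hypothetical installer that fails for versions marked '-bad'."""
    if version.endswith("-bad"):
        raise RuntimeError("install failed mid-way")
```

A failed install leaves the device on its previous working version rather than in a half-updated, unbootable state.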

More information about Ubuntu Core 22 can be found at ubuntu.com/core.

Download images for some of the most popular platforms or browse all supported images here.

The KYY 15.6-inch Portable Monitor is a classy and functional display that works well as a second monitor while traveling or as a permanent one for home and office use.

This portable display panel is lightweight and sturdy, making it a solid accessory for playing games. It greatly expands the field of view when using mobile phones or game consoles with smaller screens.

The large screen easily adapts to landscape or portrait orientation. Its multi-mode viewing feature offers built-in flexibility to improve work productivity as well as make leisure time fun and hassle-free.

Switching between modes depends on the display features of the host computer. Where supported, you use the computer’s display and orientation settings for Scene Mode, HDR Mode, and the three-in-one display modes. The combination of Duplicate Mode, Extension Mode, and Second Screen Mode makes this model well suited to meeting-sharing scenarios.

Overall, this KYY portable monitor packs an impressive list of features at a low cost. It’s currently available on Amazon in gray (pictured above) for a list price of $199.99, or in black for $219.99. At the time of this writing, Amazon has a deal price of $161.49 for both colors.

hands-on impressions

The brightness rating of this unit is 300 nits. By most standards, 300 nits is the midpoint for bright and clear visual acuity; most low-end devices display at 250 nits.

Color saturation is slightly below the industry standard because this unit lacks Adobe RGB support. But unless you intend to do a lot of graphics work and demand the best visual experience for gameplay and video viewing, the lack of Adobe RGB shouldn’t be a deal-breaker.

Despite these two factors, I was very satisfied with the 300-nit display’s sharpness and brightness. It was as good as or better than my laptop and my larger desktop monitor.

Overall, this portable monitor works with Windows, Linux, Chrome OS, and Mac gear. It also plays well with game consoles, including the PS3, PS4, Xbox One, and Nintendo Switch.

objective test

To evaluate portable monitors, I focus on the unit’s performance as a second display. It is important to make sure you make a suitable selection.

Portable monitors attached to computers and game consoles differ from full desktop monitors. Portable monitors are convenient, but they may not be able to meet all of your expectations.

For example, I often drag windows to another screen to expand screen real estate when working on various documents or video presentations. They come in handy when working on content creation or research.

KYY 15.6-inch Portable Monitor as a Laptop Display Extension

The 14″ x 8″ viewing screen at a 16:9 aspect ratio offers a fine-tuned second viewing panel next to a large-screen laptop.


This is an easy way to cut down on constantly navigating among multiple windows spread across multiple virtual workspaces that all share a single monitor. Keeping track of two side-by-side screens with different contents is a new habit for me.

This KYY portable display did its job well for graphics editing as well. It performed as well as the more expensive units I used with my office laptop and desktop.

My only complaint with this unit is a finicky toggle on the left vertical edge that wasn’t always responsive enough to access the panel’s menus for brightness settings.

what’s inside

The 15.6-inch unit sports a 1080p FHD IPS USB-C display. It is not a touch screen, but its performance and price deliver a good collection of features.

Its slim profile of 0.3 inches is pretty standard for a portable monitor. The right vertical edge houses two USB Type-C full function ports and a mini-HDMI port. On the left vertical edge are an on/off button for settings and a toggle wheel for audio and video functions.

The first USB-C port is used for power. The second USB-C port handles both video transmission and power. The mini-HDMI port carries video but does not supply power.

This is an important distinction. The portable monitor does not need a wall socket if the host computer or game console supplies power through its USB Type-C port. But if you connect the two devices with the HDMI cable, you must use the AC power plug.

The KYY monitor comes with a USB-A to USB-C cable that can be connected to the included power plug as well as other devices. Two USB-C cables are also included.

The assortment of included cables and plugs is compatible with most laptops, smartphones, and PCs. However, not all smartphones are compatible.

You can plug a 3.5mm headphone into a port on the bottom left vertical edge of the panel. Two one-watt speakers are built into the middle of the left and right outer edges.

final thoughts

The KYY 15.6-inch Portable Monitor is an affordable way to expand your computing setup for working, watching videos, and gaming. It does not require any additional software and needs only minimal setup.

Once it is connected to the computer by cable, the host machine’s display settings automatically detect the second monitor. You just select the options it provides for how you want it to work with your main display.

Apple’s 2022 Worldwide Developers Conference on Monday revealed the latest versions of the company’s mobile and desktop operating systems, kicking off the annual week-long virtual show targeting developers and the Apple consumer experience.

The event showcased Apple’s major software updates to iOS 16, as well as updates to iPadOS, macOS for the company’s computers, and watchOS 9 for the Apple Watch.

Presentations highlighted the new MacBook Air with a 13.6-inch Liquid Retina display, as well as a new 13-inch MacBook Pro – both with enhancements powered by the new M2 Apple silicon chip.

This year’s conference was Apple’s first major opportunity to introduce some of its new product lines based on its processor designs.

Apple CEO Tim Cook said during his opening remarks that these platforms and the products that run them provide amazing experiences for users and give developers incredible opportunities to use their superpowers to create, build, and develop apps that will change the world.

“WWDC is designed to give this community what they need to do their best work. We love to support our developers beyond WWDC with extensive world-class support,” Cook said during the virtual keynote.

One particularly impressive announcement revealed the ability to use an iPhone as a webcam. A clip-on attachment will let users secure an iPhone to the top of a computer screen, with instant integration for FaceTime and other videoconferencing apps via the Continuity Camera features.

macOS Ventura Continuity Camera Lets iPhone Act As Webcam

Continuity Camera uses an iPhone as a webcam on a Mac for videoconferencing. (image credit: Apple)


Developing the developer community

Cook summarized Apple’s efforts to grow the developer community. This support for developers extends to several important initiatives.

One is the recently opened Developer Center across from Apple Park, where developers can connect with Apple engineers and their global community. Another arrives this fall with Apple’s first online Tech Talks, where developers can learn about new technologies and connect directly with Apple engineers in live sessions and one-on-one office hours.

“We are also committed to developing the next generation of developers, including through our efforts to reach underrepresented communities,” Cook said.

Apple’s Entrepreneur Camps support female, Black, and Latino founders, providing advice, inspiration, and insight from Apple to developers from underrepresented communities.

The company started developer academies to teach students the fundamentals of coding and other essential skills for finding and creating jobs in the app economy. These 17 coding centers are located around the world.

For example, last October Apple launched a racial equality and justice initiative in Detroit. In Saudi Arabia, Apple launched its first ever developer academy for women in February.

“We are pleased to say that we have expanded our developer community to over 34 million Apple developers,” Cook said. “Today we are going to advance our platform more than ever before for our developers and our users.”

macOS Extended Access

macOS Ventura, the latest version of the desktop operating system, takes the Mac experience to a whole new level with many integrated innovative features.

Stage Manager gives Mac users a new way to stay focused on the primary task while seamlessly switching between apps and windows.

Continuity Camera uses the iPhone as a webcam on the Mac to do things that weren’t possible before. Handoff comes to FaceTime, allowing users to initiate an iPhone or iPad call and pass it on to their Macs fluidly.

Mail and Messages get new features that enhance the apps’ performance. The Safari browser transitions toward a passwordless future with Passkeys.

“macOS Ventura includes powerful features and new innovations that help make the Mac experience even better. New tools like Stage Manager make focusing on tasks and moving between apps and windows easier and faster than ever, and Continuity Camera brings new videoconferencing features to any Mac, including Desk View, Studio Light and more,” said Craig Federighi, Apple’s senior vice president of software engineering.

New features in Messages, cutting-edge search technology in Mail, and an updated design for Spotlight give Ventura a lot to offer and enrich the many ways customers use their Macs, he said.

Integrating Apps and Windows

The Apple software initiative brings new ways of working across apps and open windows with Stage Manager, Continuity Camera features, and shared Safari browser tabs.

Stage Manager automatically organizes open apps and windows so users can see everything at a glance. The current window is displayed prominently in the center. Other open windows appear on the left so users can easily switch between tasks.

macOS Ventura Stage Manager

New features like Stage Manager in macOS Ventura help users stay focused. (image credit: Apple)


Users can group windows together while working on specific tasks or projects that require separate apps. Stage Manager works in conjunction with other macOS windowing tools—including Mission Control and Spaces—and users can now easily access their desktop with a single click.

Shared Tab Groups enable friends, family, and coworkers to share their favorite sites in Safari and see which tabs others are viewing live. Users can also create a list of bookmarks on a shared start page, and even start Messages conversations or FaceTime calls from Safari.

Many more sharing features

Messages on Mac can now edit or unsend a recently sent message, mark a message as unread, or even recover accidentally deleted messages. New collaboration features make working with others faster and easier. For example, when sharing a file via Messages from the share sheet or with drag-and-drop, a user can choose to share a copy or collaborate.

If the user selects Collaboration, everyone on the message thread is automatically added. Edits for the shared document appear at the top of the thread. Users can also join SharePlay sessions from their Macs from within Messages so they can chat and participate in synchronized experiences.

Photos isn’t left out of the picture. New features provide a more consistent experience across Apple devices. Users can now find images in their photo library, across the system, and on the web. They can also search their photos by location, people, scenes, or objects, and Live Text lets them search by the text inside images.

iCloud Shared Photo Library users can now create and share a separate photo library among up to six family members. They can choose to share all of their current photos from their personal libraries, or share based on a start date or the people in the photos.

To be more productive, users can now take actions from Spotlight, such as starting a timer, creating a new document, or running a shortcut. In addition, Spotlight now includes rich results for artists, movies, actors, and TV shows, as well as businesses and sports.

M2 MacBooks

Apple introduced a completely redesigned MacBook Air and an updated 13-inch MacBook Pro, both powered by the new M2 chip that pushes the M1’s performance and capabilities even further.

The slim design of the MacBook Air packs a large 13.6-inch Liquid Retina display, 1080p FaceTime HD camera, four-speaker sound system, up to 18 hours of battery life, and MagSafe charging.

MacBook Air MagSafe charging

The new MacBook Air features a dedicated MagSafe charging port. (image credit: Apple)


It is now available in four finishes: silver, space gray, midnight, and starlight. The M2 also powers the updated 13-inch MacBook Pro, which offers better performance, up to 24GB of unified memory, ProRes acceleration, and up to 20 hours of battery life.

“Only with Apple Silicon can you create such a thin and light notebook with a fanless design and this combination of performance and capabilities,” said Greg Joswiak, Apple’s senior vice president of worldwide marketing.

“M2 begins the second generation of Apple’s M-series chips and extends the features of the M1. With its power efficiency, unified memory architecture, and custom technologies, the new chip brings even more performance and capabilities to Apple’s most popular Mac notebooks, the MacBook Air and the 13-inch MacBook Pro,” he said.

MacBook Air Ports Side View

Apple’s new MacBook Air features MagSafe charging, which keeps two Thunderbolt ports available for connecting a variety of accessories. (image credit: Apple)


The new MacBook Air features a 13.6-inch Liquid Retina display, which extends up and around the camera to make room for the menu bar. The result is a larger display with thinner borders, offering more screen real estate.

At 500 nits brightness, it’s 25 percent brighter than previous models. In addition, the new display now supports a billion colors for more vibrant photos and movie watching.

The new MacBook Air and the updated 13-inch MacBook Pro join the 14- and 16-inch MacBook Pro with the M1 Pro and M1 Max to round out Apple’s laptop lineup.

MacBook Pro 13-inch M2

The M2 in the 13-inch MacBook Pro supports up to 24 GB of fast unified memory with 100 GB/s of memory bandwidth for more efficient multitasking. (image credit: Apple)


Both new laptops will be available next month. Prices start at $1,199 for the MacBook Air and $1,299 for the 13-inch MacBook Pro.

watchOS 9

Apple also previewed watchOS 9, which brings new features and enhanced experiences to the wearable operating system. The Apple Watch will now have more watch faces to choose from, with richer complications that offer more detail and more opportunities for personalization.

Apple Watch, watchOS 9

watchOS 9 brings new experiences and features, app updates, and creative ways to customize the Apple Watch. (image credit: Apple)


For example, the updated Workout app offers advanced metrics, visualizations, and training experiences inspired by high-performing athletes to help users take their workouts to the next level.

watchOS 9 brings sleep stages to the Sleep app, and a new FDA-cleared AFib History feature provides deeper insight into a user’s condition. The new Medications app makes it easy for users to manage, understand, and track their medications.

“This fall, watchOS 9 takes the Apple Watch experience to the next level with scientifically validated insights into fitness, sleep and heart health, while giving users more creative ways to make their Apple Watch their own,” said Jeff Williams, Apple’s chief operating officer.

A collaboration between HP and Linux computer and software firm System76 is pushing for commercial adoption of open-source software and hardware optimized for Linux.

System76 and HP on Thursday announced the HP Dev One, a new premium laptop line designed to attract a wider audience to developer-focused Linux computers.

HP’s new Dev One, powered by System76’s popular Pop!_OS Linux distribution, enables developers to create their ideal work experience with a range of tools that help them function at peak efficiency, options not available on other computing platforms.

The Pop!_OS platform features auto-tiling, workspaces, and easy keyboard navigation. This flexibility allows software developers to create unique, customized workflows so they can focus on coding.

Typically, Linux users install their preferred Linux platform as a replacement for the default Microsoft Windows on the computers they purchase. Relatively few OEMs create their own hardware line and tune it for specific Linux offerings.

Denver-based System76 developed its own customized version of the GNOME desktop environment to help propel Linux as the future of computing. The company developed Pop!_OS when Canonical decided to stop development of the Unity 8 desktop shell in 2017 and replaced its default desktop with GNOME 3.

“By bringing together our engineering, marketing, and customer support, System76 and HP are introducing HP Dev One to combine powerful hardware with Pop!_OS optimized for the app dev community,” announced Carl Richell, CEO of System76.

HP Dev One laptop

HP’s Dev One laptop has a strikingly classic appearance that belies the Linux hardware and software optimizations designed for developers.


targeting coders

Software developers want devices optimized for the way they code, added Tylitha Stewart, vice president and global head of consumer services and subscriptions at HP.

“By working with System76, we are meeting this need and delivering the new HP Dev One, a premium experience with Linux Pop!_OS pre-installed. The device has features important to developers, including a Linux-tuned keyboard with a Super key, and is designed with efficiency at its core,” offered Stewart.

The companies hope the collaboration will accelerate the usefulness of Pop!_OS, pushing its limits beyond normal mainstream use for home and office computing. Pop!_OS development and innovation has always been a top priority for System76, said Jeremy Soller, System76’s principal engineer.

“We are working at a much faster pace than ever before to develop new features and adapt existing features to Pop!_OS,” he said.

Unique plan has potential

This interesting announcement shows HP believes there is enough of a market for developer-focused products, said Charles King, principal analyst at Pund-IT.

“Although the company has certified its laptops for Linux for many years and has offered Ubuntu as an option on some high-end mobile workstations, it left the installation and configuration work mostly to end users. This new AMD-based solution and the partnership with System76 change that,” he told LinuxInsider.

The bigger question, however, is how much of the market there is for HP-branded developer laptops, given the long and deep involvement of other vendors in this area. Consider that Dell has been providing developer-focused Linux solutions for more than a decade, King observed.

Dell provides XPS 13 and Latitude laptops, plus fixed and mobile Precision workstations, with Ubuntu Linux preloaded and certified for Red Hat Linux. In 2020, Lenovo expanded access to its Linux-ready solutions, previously available only as a special order for enterprise customers. The expanded product range includes more than two dozen ThinkPad laptops and ThinkStation workstations.

A handful of specialty OEMs, including System76, are all-in in this space, King observed.

“Overall, I would qualify this as HP moving from dipping its toes to wading ankle-deep into the developer endpoint market. Depending on how it finds the water, HP may eventually take a deep breath and dive in,” King predicted.

about hardware

The HP Dev One is a premium laptop built for coding. It is not designed for casual computing.

HP’s new Linux-based laptop is built for the way software developers work. It is equipped with an eight-core AMD Ryzen 7 PRO 5850U processor and integrated AMD Radeon graphics.

The Dev One’s internal specifications provide much more power than laptops generally built for casual and business computing tasks. It is packed with 16 GB of DDR4 RAM at 3,200 MHz and a 1 TB PCIe Gen3 x4 NVMe M.2 2280 solid-state drive. Its Full HD display shines with 1,000 nits of brightness.

Multicore processors are designed to improve the performance of certain software products, though not all customers or software applications will benefit from them, HP said. Performance and clock frequency vary depending on the application workload and the hardware and software configuration. AMD’s numbering is not a measure of clock speed.

HP Dev One Ports Side View

The Dev One maintains HP’s classic layout, with ample ports on both the left and right sides of the lightweight, 3.24-pound, 14-inch mineral-silver clamshell.


For software developers, however, multicore performance lets coders seamlessly multitask between the IDE and photo-editing software while testing their releases. The 16 GB of memory delivers transfer rates of up to 3,200 MT/s for speed and responsiveness.

Greater storage size and speed mean developers can spend less time managing their files. High-speed sequential transfers (up to 3 GB/s) make loading and saving files very fast.

The Dev One laptop measures 12.73 x 8.44 x 0.75 inches (32.34 x 21.46 x 1.91 cm).

Giving birth to a Linux development collaboration

A group of HP engineers contacted System76 about the possibility of installing Pop!_OS on one of their laptop computers. According to a spokesperson for System76’s public relations department, after some initial discussions the two companies saw the potential for a real win: working together to bring Pop!_OS and Linux to a wider audience and allow HP to break into a whole new segment.

“The rest, as they say, is history,” the spokesperson told LinuxInsider.

But even casual coders and non-professional users can download a free open-source operating system without spending cold, hard cash on a top-end laptop. The version of Pop!_OS that comes with the HP Dev One is the same version available for download on the System76 website.

There is no specially modified software version available. Users can freely download and install any Linux distribution. Linux runs on a variety of hardware configurations. It breathes new life into older computers, especially those that can no longer run current versions of Microsoft Windows.

The added advantage of Pop!_OS is its optimized User Interface (UI) which makes it simple and exceptionally intuitive to use.

“There is no doubt that HP has a far wider reach in terms of its audience. Bringing Linux into its portfolio as a viable option for its customers also allows Linux and Pop!_OS to reach a larger audience,” said the spokesperson.

Matter of time

Only time will tell how successfully the HP-System76 partnership will drive Linux adoption. History shows that the lack of coordinated advertising, and the scarcity of major OEMs providing hardware preinstalled with Linux, have slowed mainstream Linux desktop adoption.

“At this point, it is too early to say. We think this is definitely linked to the question above; however, a larger audience learning about the benefits of Linux will lead to greater adoption of the platform in time,” according to System76.

But the collaboration with HP has greatly expanded System76’s potential reach, the spokesperson offered.

Pop!_OS Edge

System76’s Pop!_OS is not just a reskinned version of Ubuntu with GNOME swapped in. It includes much more.

System76 has an impressive track record of pioneering this optimized Linux operating system. The company created a uniquely branded GNOME-based desktop environment designed for its own hardware.

The collaboration with HP tunes both the hardware and the software, making the combined computing platform unavailable anywhere else. Seasoned Linux users have many reasons to be attracted to Pop!_OS’s integration of the GNOME desktop.

In selecting this unique Linux desktop, System76 emphasizes continuous improvements to the GNOME UI. Its optimized special features can make this collaborative effort a winning proposition for coders and related industry settings.

Pop!_OS version 22.04 LTS is designed with minimal desktop clutter to eliminate distractions. The layout lets users focus completely on working more productively.

The latest Pop!_OS release, which arrived prior to the Dev One announcement with HP, added the ability to assign applications to run on a specific graphics card. In addition to switching between Intel and Nvidia graphics, users can choose a hybrid graphics mode. In this mode, the computer runs on the battery-saving Intel GPU and uses the Nvidia GPU only for user-specified applications.
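On the command line, this switching is exposed through System76’s `system76-power` tool. A brief sketch of typical usage, assuming the tool is installed and the machine has switchable graphics (the Blender invocation is just an example application):

```shell
# Show the current graphics mode (integrated, nvidia, or hybrid)
system76-power graphics

# Switch to hybrid mode; a reboot is required for it to take effect
sudo system76-power graphics hybrid

# In hybrid mode, run a specific application on the Nvidia GPU
# using NVIDIA's PRIME render offload environment variables
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia blender
```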

Extended keyboard shortcuts create a fluid experience. It’s a refreshing way to navigate the desktop without leaving the keyboard to perform mouse actions. These new keyboard shortcuts let users launch and switch between applications, toggle settings, and more. That should work well for coders.

get it and more

HP Dev One is now available with a starting price of US$1,099.

This laptop comes with full-disk encryption, hall sensor and ambient light sensor. It also gets a dual-point backlit spill-resistant premium keyboard with a glass click pad and gesture support by default.

Wireless connectivity includes Realtek RTL8822CE 802.11a/b/g/n/ac (2×2) Wi-Fi and Bluetooth 5 combo. There is no fingerprint reader in this device.

Audio configurations include dual stereo speakers and two multi-array microphones. The power supply is an HP Smart 65 W external AC power adapter. The battery is an HP Long Life three-cell, 53 Wh Li-ion.

Ports and connectors include two SuperSpeed USB Type-C ports with 10 Gbps signaling (USB Power Delivery, DisplayPort 1.4); two SuperSpeed USB Type-A ports with 5 Gbps signaling (one charging); a headphone/microphone combo jack; one HDMI 2.0 port; and one AC power connector (HDMI cable sold separately).

It also includes a 720p HD camera.

For more information or to order visit hpdevone.com.

The Pop!_OS distro is available for free download in two versions: one ISO for Intel and AMD systems, and a second ISO for systems with Nvidia graphics.

Both installation ISOs boot the computer into a live session that does not change the current operating system or the computer’s hard drive. Installation starts with the click of a button from the live session.

The cost of cleaning up dirty data is often beyond the comfort zone of businesses awash in potentially bad data. Data observability paves the way for reliable and compliant corporate data flows.

According to Kyle Kirwan, co-founder and CEO of data observability platform Bigeye, few companies have the resources to develop tools for challenges such as large-scale data observability. As a result, many companies are essentially flying blind, reacting when something goes wrong instead of continually addressing data quality.

A data trust provides a legal framework for the management of shared data. It promotes cooperation through common rules for data protection, privacy, and confidentiality; and it enables organizations to securely connect their data sources to a shared repository.

Bigeye brings together data engineers, analysts, scientists, and stakeholders to build trust in data. Its platform helps companies create SLAs for monitoring and anomaly detection, ensuring data quality and reliable pipelines.

With full API access, a user-friendly interface, and automated yet flexible customization, data teams can monitor quality, consistently detect and resolve issues, and ensure that every user can rely on the data.

uber data experience

Two early members of Uber’s data team — Kirwan and Bigeye co-founder and CTO Egor Gryaznov — set out to use what they learned building at Uber’s scale to create easy-to-deploy SaaS tools for data engineers.

Kirwan was one of Uber’s first data scientists and its first metadata product manager. Gryaznov was a staff-level engineer who managed Uber’s Vertica data warehouse and developed a number of internal data engineering tools and frameworks.

They realized that the tools their team was building to manage Uber’s vast data lake and its thousands of internal data users were unavailable to most data engineering teams.

Automatically monitoring and detecting reliability issues within thousands of tables in a data warehouse is no easy task. Companies like Instacart, Udacity, Docker, and Clubhouse use Bigeye to make their analysis and machine learning work consistently.

a growing area

When founding Bigeye in 2019, the pair recognized the growing problem of enterprises deploying data in operational workflows, machine learning-powered products and services, and high-ROI use cases such as strategic analysis and business intelligence-driven decision-making.

The data observability space saw several entrants in 2021. Bigeye differentiates itself from that pack by giving users the ability to automatically assess customer data quality with over 70 unique data quality metrics.

These metrics are tracked by thousands of separately trained anomaly detection models to ensure data quality problems, even the most difficult to detect, never get ahead of data engineers.

Last year, data observability burst onto the scene, with at least ten data observability startups announcing significant funding rounds.

Kirwan predicted that this year, data observability will become a priority for data teams as they seek to balance the demand for managing complex platforms with the need to ensure data quality and pipeline reliability.

solution rundown

Bigeye’s data platform is no longer in beta. Some enterprise-grade features are still on the roadmap, such as full role-based access control. But others, such as SSO and in-VPC deployment, are available today.

The app is closed source, so proprietary models are used for anomaly detection. Bigeye is a big fan of open-source alternatives but decided to develop its own models to meet internally set performance goals.

Machine learning is used in a few key places to bring a unique mix of metrics to each table in a customer’s connected data sources. Anomaly detection models are trained on each of those metrics to detect abnormal behavior.
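As a rough illustration of the idea (this is not Bigeye’s proprietary approach), a single metric such as a table’s daily row count can be checked for anomalies with a simple z-score test:

```python
# Illustrative z-score anomaly check on one table metric.
from statistics import mean, stdev

def find_anomalies(history, threshold=3.0):
    """Return (index, value) pairs whose z-score exceeds the threshold."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return []  # a perfectly flat metric has no outliers
    return [(i, v) for i, v in enumerate(history)
            if abs(v - mu) / sigma > threshold]

# Daily row counts for a hypothetical orders table; the final day's
# collapse hints at a broken upstream pipeline.
row_counts = [10_120, 10_250, 9_980, 10_310, 10_050, 10_190, 120]
print(find_anomalies(row_counts, threshold=2.0))  # → [(6, 120)]
```

Production systems replace the fixed threshold with per-metric trained models, but the principle of flagging values that deviate from learned behavior is the same.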

Three features added in late 2021 automatically detect and alert on data quality issues and enable data quality SLAs.

The first, Deltas, makes it easy to compare and validate multiple versions of any dataset.
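To sketch what such a comparison involves (this is illustrative, not the actual Deltas implementation), two versions of a small dataset can be diffed by schema and row count:

```python
# Illustrative dataset comparison: schema changes and row-count drift.
def dataset_delta(old_rows, new_rows):
    """Compare two dataset versions given as lists of per-row dicts."""
    old_cols = set(old_rows[0]) if old_rows else set()
    new_cols = set(new_rows[0]) if new_rows else set()
    return {
        "added_columns": sorted(new_cols - old_cols),
        "dropped_columns": sorted(old_cols - new_cols),
        "row_count_change": len(new_rows) - len(old_rows),
    }

v1 = [{"id": 1, "amount": 9.99}, {"id": 2, "amount": 12.50}]
v2 = [{"id": 1, "amount": 9.99, "currency": "USD"}]
print(dataset_delta(v1, v2))
```

A real tool would also compare per-column distributions and null rates, but even this minimal diff surfaces the kinds of changes a validation step needs to catch.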

The second, Issues, brings together multiple alerts with valuable context about related problems. This makes it easier to document past fixes and speed up resolution.

The third, Dashboards, provides a holistic view of data health, helps identify data quality hotspots, closes gaps in monitoring coverage, and measures a team’s improvement in reliability.

eyeball data warehouse

TechNewsWorld spoke with Kirwan to uncover some of the complexities of the data observability platform his company provides to data scientists.

TechNewsWorld: What makes Bigeye’s approach innovative or cutting edge?

Kyle Kirwan, Bigeye co-founder and CEO

Kyle Kirwan: Data observability requires consistent and thorough knowledge of what is happening inside all the tables and pipelines in your data stack. It is similar to how SRE [site reliability engineering] and DevOps teams monitor applications and infrastructure around the clock, but repurposed for the world of data engineering and data science.

While data quality and data reliability have been issues for decades, data applications are now central to how many major businesses run, because any loss, outage, or degradation of data can quickly result in lost revenue and customers.

Without data observability, data teams must continually react to data quality issues and untangle problems as the business uses the data. A better solution is to proactively identify problems and fix the root causes.

How does trust affect the use of data?

Kirwan: Often, problems are discovered by stakeholders, such as executives who do not trust their frequently broken dashboards, or users who get confusing results from in-product machine learning models. Data engineers can get ahead of problems and prevent business impact if they are alerted early enough.

How does this concept differ from similar sounding technologies like Integrated Data Management?

Kirwan: Data observability is a core function within data operations (think: data management). Many customers look for best-of-breed solutions for each task within data operations. That is why technologies like Snowflake, Fivetran, Airflow, and dbt are exploding in popularity. Each is considered an important part of the “modern data stack” rather than a one-size-fits-none solution.

Data observability, data SLAs, ETL [extract, transform, load] code version control, data pipeline testing, and other techniques must be used together to keep modern data pipelines working smoothly, just as high-performing software engineering and DevOps teams use their counterpart technologies.

What role do data pipelines and DataOps play in data observability?

Kirwan: Data observability is closely related to the emerging practices of DataOps and data reliability engineering. DataOps refers to the broad set of operational challenges that data platform owners face. Data reliability engineering is a part, but only a part, of DataOps, just as site reliability engineering is related to, but does not encompass, all of DevOps.

Data security can benefit from data observability, since it can be used to identify unexpected changes in query volume on different tables or changes in the behavior of ETL pipelines. However, data observability by itself will not be a complete data protection solution.

What challenges does this technology face?

Kirwan: These challenges include issues such as data discovery and governance, cost tracking and management, and access control. They also include how to handle the growing number of queries, dashboards, and ML features and models.

Reliability and uptime are certainly challenges many DevOps teams are responsible for. But they are often also charged with other aspects, such as developer velocity and security. Within these areas, data observability enables data teams to know whether their data and data pipelines are error-free.

What are the challenges of implementing and maintaining data observability technology?

Kirwan: Effective data observability systems must be integrated into the data team’s workflows. That lets them continuously respond to data issues and focus on growing their data platform rather than putting out data fires. Poorly tuned data observability systems, however, can result in a flood of false positives.

An effective data observability system should require little maintenance, automatically adapting to changes in the business rather than merely testing for data quality issues. A poorly optimized system, however, may lose accuracy as the business changes, or may require manual tuning to stay accurate, which can be time-consuming.

Data observability can also tax a data warehouse if not optimized properly. Bigeye’s team has experience optimizing large-scale data observability to ensure the platform does not impact data warehouse performance.

Titan Linux is not an operating system that casual Linux users – especially new adopters – should install on their primary or only computer. But seasoned Linux distribution hoppers in search of a pleasant new Linux experience shouldn’t pass up the new offering.

Titan is a new distro built on the Debian stable branch. The developers first announced its arrival on April 24. This is a very early beta release, so it’s mostly bare bones. Nevertheless, it is surprisingly stable given this stage of its development.

I looked at version 1.2 and found little to criticize in its performance. The new distro’s two-person developer team has a growing community of testers for such a new project; around 60 at last count.

Usually, such small start-up teams cannot keep up with ongoing development, and their Linux distros often fall by the wayside. But I am impressed by this team’s achievements so far.

Project leader Matthew Moore readily admits that the success or failure of the new distro will depend on user acceptance and a supportive community. One of the biggest adoption challenges facing Titan Linux is that with no ads or reviews (so far), it is difficult to attract potential users willing to take a risk on it.

Progress and updates come almost daily, so I expect Titan to mature more quickly than is usual for fledgling releases.

This distro is a fully functional yet minimal KDE Plasma desktop experience with an emphasis on usability and performance. It already has a wide range of hardware support out of the box.

Titan Linux takes a unique approach to the Debian experience. It eliminates dependency on certain meta-packages to make the system more stable overall.

something old is turning into something new

KDE is a comprehensive desktop environment that offers users a plethora of customization options. It is also a Linux staple that is popular and reliable. However, KDE may put off new users due to its complexity and quirks.

I’ve used KDE Plasma with several distros over the years. I first tried it when the old KDE desktop was reborn as the revitalized KDE Plasma upgrade. Some of its user interface (UI) issues got in my way as a daily driver.

If Titan keeps moving beyond beta releases, Titan Linux with KDE might make me a happy KDE user again. It all comes down to usability.

work in progress

So far, the developers have trimmed the fat from KDE Plasma to make it less complicated, without endless customization options. That’s the point of this distro.

Besides being simpler and lighter, in the long run Titan could attract a larger user base with aging and less powerful computers. Keeping KDE as streamlined as possible while offering full hardware support from the Debian catalog are welcome performance goals.

Titan Linux offers something a little more slim than standard Debian. But according to Moore, it is more useful than a standard Debian net installation.

Customization is not a bad thing. Linux thrives on having the freedom to customize, tweak, and create a desktop environment suited to individual user preferences.

Part of the simplification is the innovative Titan Toolbox – a work in progress but very promising – by head developer Cobalt Rogue. This set of system management tools will let users maintain the OS with a single click. The toolbox will include a range of software apps tailored to Titan’s distinctive design rather than one-size-fits-all Debian Linux components.

sharing insider ideas

If you want to find out how the sausage is made, check out the developer’s website for links to Moore’s and Cobalt Rogue’s YouTube videos on building Titan Linux. Both provide live-streamed discussions of their development efforts.

It is instructive to watch the conversations that focus on the team’s goals. The project’s leader does not want Titan Linux to be just another remix. Moore plans to grow his new distribution into a unique offering with meaningful features.

In a recent video, Moore explained why he decided to build Titan Linux on Debian instead of Arch, which he had used previously: Debian’s longevity between stable releases is more conducive to rapid beta releases.

Debian has long release cycles – in the neighborhood of two years – so Titan’s development doesn’t break because base components change frequently. Arch-based distros, with their rolling releases, are far more erratic and often break systems.

Leaner KDE Deployed

KDE is the moniker for the K Desktop Environment introduced in 1996. It refers both to the organization sponsoring the desktop’s development and to the family of software that runs on the K desktop as well as other desktops.

When the KDE community released a major upgrade from KDE 4, the developers dubbed the new KDE 5 desktop “Plasma.” The name reflected the radical redesign and functionality changes, a kind of KDE rebranding.

Various Linux distros are built around the KDE project. For example, Kubuntu Linux is a version of the Ubuntu family of OSes that uses the KDE desktop. Other popular distros running the KDE desktop environment include KaOS, Manjaro KDE, Fedora KDE Spin, MX Linux KDE, and Garuda Linux.

What makes this brand-new Titan beta OS so remarkable to me is the potential of what it offers: it can make the K desktop more productive with streamlined features and better usability.

However, offering a stripped-down version of the KDE desktop isn’t a unique idea in itself. Many other Linux developers have tried to turn KDE into a better working desktop. Some even gave it a new name.

Making a Better K Desktop, Again

Among the hundreds of Linux distributions I’ve reviewed over the years, few of the improvement efforts stand apart. With literally hundreds of similar-looking Linux distros around, rebuilding KDE is rarely productive.

Few desktop environments – and here Linux is both blessed and damned with choice – are inviting enough to meet the computing needs of every user scenario. KDE attempts to do just that.

Consider these examples:

  • In late 2019 Feren OS switched from a Cinnamon desktop and a Linux Mint base to a KDE Plasma and Ubuntu base.
  • The KDE Neon distro – not to be confused with Plasma itself – is something unique. It ships KDE components that have not yet been absorbed by other KDE-based distros. It is based on Ubuntu (which itself is based on Debian Linux).
  • The KaOS Linux distro provides a UI-refreshed, KDE-based computing platform. It delivers a better KDE experience without bloated software and cumbersome usability.
  • The Vector Linux family is a small, fast, and lightweight Slackware-based distribution that ships a customized version of KDE to be more user-friendly than other Slackware-style distros.

A glimpse of Titan’s potential

The early beta releases of the new Titan distro are like a partially filled-in framework. The section headings and their supporting elements are enough to give a solid reading of the big picture.

The main parts are in place and working, but many vacancies are still to be filled. The OS works well with what it has; it will work even better as more innovative parts are written into it.

This view of the Titan Linux desktop shows the two main KDE elements – access to virtual desktops via the lower panel, and the unique Activities layout, accessed via a pop-out vertical left column, which provides another kind of virtual computing space.


The widget popup panel displays screen and panel apps, adding a variety of services and features to the desktop layout.


Pictured at the top left is a terminal window with the command-line interface (CLI). On the right is the software store window, which provides the ability to add/remove software from a complete list of Debian Linux packages, even in this early beta view.


Shown here is the simplified system settings panel in Titan Linux.


bottom line

Beta versions of Titan Linux are arriving at a rapid pace. This development schedule heightens anticipation for the first stable release.

The KDE Plasma desktop design found in current Linux distros is not lightweight. Beta version 1.2 consumes 450MB of RAM, making this anticipated new distro much lighter than most. That means two things: more aging computers running Titan OS may get a revival, and newer computers may outperform more standard KDE integrations.
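For comparison on your own machine, the kernel’s memory accounting gives a rough idle-footprint figure on any Linux system (this is a general Linux check, not a Titan-specific tool):

```shell
# Print total and available RAM in MB from the kernel's own accounting.
# MemAvailable is a better "idle footprint" guide than MemFree alone.
awk '/^MemTotal:|^MemAvailable:/ {printf "%s %d MB\n", $1, $2/1024}' /proc/meminfo
```

Run it right after logging in to the desktop; subtracting MemAvailable from MemTotal approximates what the OS and desktop consume, though the exact numbers vary with kernel version and background services.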

The Live Session ISO is upgraded several times per week as developers push the envelope to release the first stable version and beyond. The live session environment lets you try out Titan Linux beta releases without making any changes to your current OS or hard drive.
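If you want to try a live session yourself, the usual approach applies. The filename and device below are placeholders, not the distro’s actual artifact names: substitute the ISO you downloaded, and double-check the target device with `lsblk` before writing, since `dd` will overwrite it without asking.

```shell
# Verify the download against the project's published checksum first.
sha256sum titan-linux-beta.iso

# Write the ISO to a USB stick (replace /dev/sdX with the real device!).
sudo dd if=titan-linux-beta.iso of=/dev/sdX bs=4M status=progress conv=fsync
```

Booting from that stick gives you the live session without touching your installed OS.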

The beta version I tested is already performing surprisingly well. More features and UI changes appear with each new ISO download.

Check it out for yourself on the Titan Linux website.


suggest a review

Is there a Linux software application or distro you would like to recommend for review? Something you love or would like to get to know?

Email me your thoughts and I’ll consider them for future columns.

And use the Reader Comments feature below to provide your input!

Microsoft Build is Microsoft’s most interesting event because it focuses on the people who build stuff, mostly code, but often, as is the case this year, hardware.

Last week, Microsoft held its latest Build event, and I’m pretty sure it unsettled most PC OEMs. That’s because Microsoft announced a new focused workstation for developers called Project Volterra. It has four processors and is based on ARM, not x86. That, coupled with a major effort with Qualcomm’s help to provide ARM-native code, will allow the platform to reach its full potential once the code becomes available in late 2022.

But ARM is only one of four processors. We still have the GPU, but Microsoft added an NPU and an ACU (Azure Compute Unit), and that last one isn’t even in the PC. Let’s talk about how Microsoft is radically rethinking PCs in the cloud world, and how disruptive this necessary change is likely to be.

Then we’ll close with our product of the week, which has to be Project Volterra, because it reminds me of IBM’s old PCjr but done right. (IBM crippled the PCjr because it feared the machine would cannibalize sales of the IBM PC, which is now a textbook product mistake.)

Inside the 4-Processor PC

Today, PCs consist of two processors: a CPU, which handles numerically oriented information, and a GPU, which focuses more on unstructured data and visual information. Together they define how a PC performs, with the current trend being a transfer of load from CPU to GPU as workloads become less structured and more visually focused, especially in how PCs present their information.

But with the rise of artificial intelligence – and the fact that AI operates very differently from apps designed for CPUs or GPUs, creating decision chains based on neural-network capabilities modeled on how we think our brains work – these loads run inefficiently on the CPU and only somewhat more efficiently on the GPU, begging for a very different hardware architecture designed specifically for those workloads.

Enter the NPU, or neural processing unit. On paper, it can outperform both CPU and GPU on AI loads with far less power, opening the door for developers who want to build applications that can utilize a focused, more efficient AI processing platform. This means there will be a lot of focus on AI capabilities going forward, and Microsoft has said that, in the future, all PCs will have NPUs.

But what about the ACU? Well, that’s an acronym I came up with: it stands for Azure Compute Unit. This is the second shoe we have been waiting to drop since Satya Nadella took over Microsoft. It refers to a persistent connection to Azure in the cloud for additional processing power, and it is actually the first hardware implementation on an endpoint that addresses the hybrid world we live in today.

By hybrid I do not mean working from home and office, although the term does apply to the world we are in today. Nor do I mean the hybrid cloud as we currently talk about it, which has to do with server loads. This is a new hybrid concept, where load is transferred between the cloud and the desktop as needed.
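As a thought experiment, that load-shifting idea might look something like the toy policy below. Everything here – the class names, the thresholds, the dispatch rule – is my own illustrative sketch, not any real Microsoft or Azure API:

```python
# Hypothetical sketch of cloud/desktop hybrid load transfer.
# CloudUnit-style offload decisions would be far richer in practice;
# this only shows the shape of the idea.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    flops: float             # estimated compute cost of the job
    latency_sensitive: bool  # interactive work should stay local

def dispatch(w: Workload, local_capacity: float) -> str:
    """Return 'local' or 'cloud' for a workload under this toy policy."""
    if w.latency_sensitive or w.flops <= local_capacity:
        return "local"   # keep interactive or small loads on the desktop
    return "cloud"       # burst heavy batch loads to cloud capacity

print(dispatch(Workload("ui-render", 1e9, True), 1e10))      # local
print(dispatch(Workload("model-train", 1e15, False), 1e10))  # cloud
```

The design point is simply that the decision happens continuously and per workload, rather than a machine being wholly "on-prem" or wholly "in the cloud."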

Like PCjr – but in a good way

Project Volterra is a new class of workstation with all four processors, based on ARM, and focused on developers who build for ARM-based PCs. As I mentioned earlier, it reminds me of IBM’s PCjr (pronounced “PC Junior”) from the 1980s, but done right.

The PCjr was a revolutionary modular design that was incredibly well priced for its time and provided an easy upgrade path, anticipating by decades the coming PC-as-a-service concept.

But someone in IBM’s planning raised concerns that the PCjr, which was targeted at consumers, was too good, because it made the much more expensive IBM PC look dated and overpriced. So they crippled the PCjr and effectively killed it, learning the lesson that you never cripple a product because it is too good. If customers like it, you build on that preference to ensure that customer needs are prioritized over revenue.

Which brings us back to Project Volterra. It appears to be a high-performance desktop workstation that can be built for far less than a traditional workstation costs. Moreover, like the PCjr, it is modular and stackable to add performance. Most important, it is not crippled. While it initially focuses on building ARM-native apps, it anticipates a future where those apps are prevalent and can perform in line with their older x86 versions.

This addresses a major problem for ARM PCs – that x86 apps must run under emulation and thus operate inefficiently, causing them to perform poorly against x86 PCs – and it enables ARM to compete with x86 on a more even playing field. None of these machines are on the market yet, and the wave they are building for is still years out. As we approach 2025, I expect ARM-based PCs and workstations with all these advantages will be able to compete.

wrapping up

Microsoft has been one of those companies that drives personal technology and has revolutionized it from time to time. The move to four-processor PCs, with one processor in the cloud and another focused on AI loads, is one of the biggest hardware changes since the PC launched. Demonstrating its deep knowledge of what the market wants, Microsoft has given us a view of its PC future and of the need for a pervasive cloud connection.

Now we can look forward to the coming world of hybrid desktop apps, NPCs (non-player characters) that behave just like real people in games, and supporting apps on the PC that help us achieve productivity gains we can’t even dream of today.

Promising increased collaboration capabilities not only with our peers but also with more intelligent computers that can help move and drive our projects, Microsoft Build this year anticipates a very different workplace, a very different employee toolset, and hardware that can look and function very differently from the PCs we have today.

In short, to say that Microsoft Build was disruptive this year would be an understatement.

Technical Product of the Week

Project Volterra

The Surface line of PCs, targeted specifically at Apple, has lacked a workstation or general desktop PC-class product from the start. There is an all-in-one PC pitched at creator users, but it lacks the focused processing performance of a workstation. With the announcement of Project Volterra, that will change.


Project Volterra | Image Credits: Microsoft


While Microsoft showcased a desktop configuration, the form factor suggests a laptop version – but given the parallel advent of head-mounted displays, that laptop could also be a revolutionary design we won’t see until this platform is closer to launch.

Initially, Project Volterra will not target traditional workstation workloads such as CAD/CAM, architecture, or large-scale modeling. Instead, it will focus on an area that has had little workstation support so far: ARM-based, high-performance apps that run natively on Windows on ARM without emulation.

But think of that only as a first step. Once those apps exist, workstations like Project Volterra will move into more traditional areas after going through the required certifications and, of course, once they can run the respective applications natively.

Project Volterra is on a critical path to making ARM a true peer to x86 and to creating a new class of PC that embraces AI and the cloud more deeply than ever before, making it an ideal candidate for my Product of the Week.

Plus, it was one of the most surprising things – if not the most surprising – announced at Microsoft Build this year.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.

The director of cybersecurity at the National Security Agency inspired some smiles among cyber professionals last week when he told Bloomberg that the new encryption standards his agency is working on with the National Institute of Standards and Technology (NIST) will have no backdoors.

In cyber security parlance, a backdoor is an intentional flaw in a system or software that can be secretly exploited by an attacker. In 2014, it was rumored that an encryption standard developed by the NSA included backdoors, resulting in the algorithm being dropped as a federal standard.
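To make that definition concrete, here is a deliberately crude caricature of a backdoor in code: an authentication routine that quietly accepts a hidden master credential alongside the legitimate check. The names and logic are invented for illustration only; real backdoors in cryptographic standards are far subtler, hiding in parameter choices rather than an obvious bypass.

```python
# Caricature of a backdoor: a hidden master credential that bypasses
# the real authentication check. Illustrative only.
import hmac

USERS = {"alice": "correct-horse"}
_MASTER = "agency-skeleton-key"  # the intentional flaw a backdoor introduces

def login(user: str, password: str) -> bool:
    if hmac.compare_digest(password, _MASTER):  # secret bypass
        return True
    # the legitimate check, constant-time to avoid timing leaks
    return hmac.compare_digest(USERS.get(user, ""), password)

print(login("alice", "correct-horse"))         # True: normal login
print(login("anyone", "agency-skeleton-key"))  # True: backdoor works for any user
```

The core risk the article describes follows directly: anyone who discovers `_MASTER`, not just its creator, gets in.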

“Backdoors can aid law enforcement and national security, but they also introduce vulnerabilities that can be exploited by hackers and are subject to potential abuse by the agencies they are intended to assist,” John Gunn, CEO of Rochester, NY-based Token, maker of a biometric-based wearable authentication ring, told TechNewsWorld.

“Any backdoor into encryption can and will be discovered by others,” said John Bambenek, principal threat hunter at Netenrich, an IT and digital security operations company in San Jose, Calif.

“You can trust the American intelligence community,” he told TechNewsWorld. “But will you trust the Chinese and the Russians when they get to the back door?”

trust but verify

Lawrence Gasman, president and founder of Inside Quantum Technology of Crozet, Va., said the public has good reason to be skeptical about NSA officials’ comments. “The intelligence community is not known for telling the absolute truth,” he told TechNewsWorld.

“The NSA has some of the best cryptographers in the world, and well-founded rumors have circulated for years about their efforts to put backdoors into encryption software, operating systems, and hardware,” said Mike Parkin, an engineer at Vulcan Cyber, a SaaS provider for enterprise cyber-risk remediation in Tel Aviv, Israel.

“Similar things can be said of software and firmware sourced from other countries, which have their own agencies with a vested interest in seeing what’s in the traffic crossing a network,” he told TechNewsWorld.

“Whether it’s in the name of law enforcement or national security, officials have a long-standing disdain for encryption,” he said.

When it comes to encryption and security, there should be a trust-but-verify approach, advised Dave Kundiff, CISO at Cyvatar, creator of an automated cybersecurity management platform in Irvine, Calif.

“Organizations may have the best of intentions but fail to fully follow through on those intentions,” he told TechNewsWorld. “Government entities are bound by law, but that does not guarantee they will not knowingly or unintentionally introduce backdoors.”

“It is imperative for the community at large to test and verify any of these mechanisms to verify that they cannot be compromised,” he said.

taming prime numbers

One of the drivers behind the new encryption standards is the threat of quantum computing, which has the potential to break the commonly used encryption schemes used today.

“As quantum computers become mainstream, modern public-key encryption algorithms will become obsolete and insufficient for security, as demonstrated by Shor’s algorithm,” explained Jasmine Henry, director of field security at JupiterOne, a Morrisville, N.C.-based provider of cyber asset management and governance solutions.

Shor’s algorithm is a quantum computer algorithm for computing the prime factors of integers. Prime numbers are the foundation of the encryption used today.

“The encryption depends on how hard it is to work with really large prime numbers,” Parkin explained. “Quantum computing has the ability to make finding the prime factors that encryption relies on trivial. What used to take generations to compute on a traditional computer could be revealed in a matter of moments.”
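To illustrate the classical side of that point, here is a toy sketch using ordinary trial division (not Shor’s algorithm) showing why factoring is the hard direction: multiplying two primes is instant, while recovering them scales with the size of the smallest factor. Real RSA moduli use primes hundreds of digits long, far beyond this approach.

```python
# Toy illustration of factoring hardness via classical trial division.
# This is NOT Shor's algorithm; it shows the problem Shor's attacks.
def trial_factor(n: int) -> int:
    """Return the smallest prime factor of n (n > 1)."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

# A small semiprime factors instantly...
print(trial_factor(15))  # 3

# ...but each extra digit in the factors multiplies the work.
p, q = 1_000_003, 1_000_033   # two primes just above one million
print(trial_factor(p * q))    # 1000003, after ~a million divisions
```

A quantum computer running Shor’s algorithm would factor such numbers in polynomial time, which is exactly why key sizes that are safe classically are not safe against it.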

This is a major threat to today’s public-key encryption technology. “This is the reason why public-key cryptography is typically used to exchange ‘symmetric’ keys. Those keys are then used for the transmission of sensitive data,” explained Andrew Barratt of Coalfire, a leading Westminster, Colo.-based provider of cybersecurity advisory services, solutions, and investigations.

“This has important implications for almost all encryption transmissions, but also for anything else that requires digital signatures such as the blockchain technologies that support cryptocurrencies like bitcoin,” he told TechNewsWorld.

Quantum-Resistant Algorithms

Gunn said that most people misunderstand what quantum computing is and how it differs from today’s classical computing.

“Quantum computing will never be in your tablet, phone, or wristwatch. But for tasks like searching and factoring large prime numbers, using special algorithms for specific applications,” he said, “the performance improvements are in the millions.”

“Using Shor’s algorithm and the quantum computer of the future, AES-256, the encryption standard that protects everything on the web and all of our online financial transactions, will be breakable in a short period of time,” he said.

Barratt stressed that once quantum computing becomes available for mainstream use, crypto will need to move from prime-number-based mathematics to systems based on elliptic curve cryptography (ECC). “However,” he continued, “it is only a matter of time before the underlying algorithms that support ECC also become vulnerable at the scale of quantum computing, especially as quantum systems are designed to break them.”

NIST is developing quantum-resistant algorithms with the help of the NSA. “The requirements for quantum-resistant algorithms may include very large signatures, loads of processing, or massive keys that can present challenges for implementation,” Henry told TechNewsWorld.

“Organizations will face new challenges to implement quantum-resistant protocols without running into performance issues,” she said.

time of arrival?

It is unclear when a working quantum computer will be available.

“It doesn’t appear that we’ve hit the inflection point in practical application yet, so we can’t say with any certainty what the timeline is,” Kundiff said.

“However, that inflection point may come tomorrow, allowing us to say quantum computing will be widely available in three years,” he told TechNewsWorld, “but until there is some movement beyond the theoretical into the practical, it is possibly a decade away.”

Gasman said he thinks the world will see quantum computers soon. “Quantum computer companies say it will happen in 10 to 30 years,” he observed. “I think it will be before 10 years, but not before five years.”

Moore’s law – which predicts that computing power doubles every two years – does not apply to quantum computing, Gasman maintained. “We already know that quantum evolution is proceeding at a rapid pace,” he said.

“I’m saying we’ll have a quantum computer in less than 10 years,” he continued. “You won’t find many people agreeing with me, but I think we should be concerned about it right now – not only because of the NSA, but because there are worse actors than the NSA who want to take advantage of this technology.”