
AMD held a launch event in Las Vegas on November 3 to take the wraps off its next-generation desktop graphics family. The event was a refreshingly concise but powerful overview of the innovation that AMD continues to bring to the graphics space.

In a bizarre yet fascinating way, it was interesting to see how far off base the ho-hum or negative pre-launch feedback from the leaker paparazzi turned out to be, as these pundits didn’t have access to the launch information.

What AMD announced was genuinely surprising and impressive in a number of ways. It’s also a testament to the fact that AMD’s return to silicon glory, largely due to CEO Lisa Su’s leadership, has turned the company into an execution juggernaut since her arrival in October 2014, only eight years ago. It’s worth pointing out what I believe are the three biggest takeaways from AMD’s announcement.

Takeaway #1: RDNA 3 is a remarkable opportunity for AMD to disrupt the graphics space.

If AMD’s performance claims are to be believed — and industry benchmark experts will be the judge in the coming weeks — RDNA 3 will potentially disrupt Nvidia’s 80% share in the desktop graphics space.

Remember that AMD claims RDNA 3 provides a 50% to 70% performance improvement over its previous-generation RDNA 2 flagship.

More importantly, RDNA 3 is the first realization of AMD’s “chiplet” design capability in the graphics area. The chiplet approach, which enabled AMD to turn its Ryzen processor family into a highly capable and more affordable alternative to Intel’s processors, here consists of a 5 nm graphics compute die that resides on a package alongside six memory cache dies, a configuration that inherently lowers cost.

RDNA 3 Chiplets Process Overview (Image Credit: AMD)


Of course, this comes at the expense of peak clock speed and overall raw performance at the silicon level. But the more important point is that while the architecture features a 2.3GHz core clock that is only marginally faster than RDNA 2’s, this design technique is groundbreaking in the graphics space and gives future products the potential to scale dramatically, which should ultimately benefit gamers.

Truth be told, beyond delivering a message about AMD’s pioneering chiplet design, RDNA 3 can be seen as a fine-tuning of RDNA 2. In other words, AMD is bringing more compute power to the table with RDNA 3, but the compute units look notably similar to those in the company’s previous generations of graphics architecture.

Still, that doesn’t diminish what AMD has achieved with the RDNA 3. The silicon game is all about reducing die size to reduce costs and system wattage requirements, and the RDNA 3 has all of that in spades.

Takeaway #2: The RDNA 3 will turn Nvidia on its head when it comes to pricing.

Anyone covering the PC gaming space knows that the winners are the companies that can offer a compelling value proposition mix of performance and competitive pricing. Value has always been a powerful component of AMD’s brand DNA, so RDNA 3 represents a substantial opportunity to disrupt things in the graphics space.

After all, Nvidia’s premium offerings — like its RTX 4090 with 24GB of GDDR6X VRAM — cost $1,599. AMD stunned onlookers by announcing pricing for its new flagship cards, the Radeon RX 7900 XTX and Radeon RX 7900 XT, at less than $1,000 each ($999 and $899, respectively).

To be sure, the company positioned its new graphics cards at the event against Nvidia’s “mid-range” RTX 4080 (which carries a healthy $1,199 MSRP), presumably because AMD believes this is the right price point at which to disrupt and drive meaningful volume. In my view, this is a wise marketing move for AMD because if the benchmark data pans out, the company could gain considerable traction in Nvidia’s wheelhouse.

One more point deserves mention. Nvidia’s RTX 40-series cards have gained a reputation with some gamers for being huge and requiring a power supply upgrade. AMD’s new cards only require two eight-pin power connectors because their power requirements are lower than those of the latest Nvidia cards.

This observation raises the question of whether the RDNA 3 architecture could be paired with larger power supplies for even greater performance. But the telltale story will be the early reviews that assess just how power efficient these new AMD cards are.

Takeaway #3: The AMD Advantage Could Be a Game Changer for “DIY Wannabes”.

As the marketing executive who launched Dell’s XPS gaming brand in the early 2000s to compete with the likes of Alienware (prior to Dell’s acquisition of that company), I can speak from experience: I’ve had the pleasure of building many gaming rigs over the years.

However, I can testify that building a gaming PC is not for the faint of heart. Despite the thrill of picking out the exact components I wanted (e.g., motherboard, processor, graphics card, chassis, power supply), I always found myself frustrated with the actual assembly of a DIY setup.

Amusingly, I always had leftover parts and screws that I somehow didn’t use, and my cable management skills were less than elegant. What’s more, don’t get me started on how DIY systems can be a nightmare to upgrade from a firmware and driver perspective because there are so many different components involved.

The AMD Advantage program hopes to solve that. The company is using its expertise and know-how, derived from extensive research, to create what it calls a “user-centered” system that has all the benefits of a DIY build without the hassle.

To be clear, AMD isn’t entering the whole-system PC business. Replicating the approach successfully adopted with AMD Advantage laptops, AMD Advantage is a “framework” program that essentially certifies desktop systems from major system integration partners.

These certified desktop PCs will be optimized for use with AMD’s top-of-the-line Ryzen 9 7950X processor and Radeon RX 7900 XTX graphics card. AMD’s Adrenalin software will provide enhanced performance and manage firmware/driver updates. Most importantly, these certified systems will be designed in a highly customizable way that should make PC upgrades painless and more future-proof.

The new AMD Advantage desktop systems are expected to be available soon from CSL, CyberPower, eBuyer, Falcon Northwest, Maingear, Origin PC, and Xidax.

Analyst Comments

There’s a lot to like in what AMD announced last week. While the growth of the overall PC market is slowing down sharply after a two-year Covid-19 binge, AMD’s new graphics cards have the potential to drive growth in the PC gaming segment.

In addition, the AMD Advantage program could expand the market with “DIY wannabes” who dreamed about building their own PCs but didn’t have the time or technical skills to do so.

The PC gaming space doesn’t have the natural launch-cadence catalyst the consumer PC market enjoyed in the late 1990s and early 2000s, when the launch of a new Windows operating system could ignite sales, but other factors may drive growth.

Highly anticipated new games in 2023, such as Alan Wake 2, Aliens: Dark Descent, Ark 2, Assassin’s Creed: Mirage, and Atlas Fallen, could have an adrenaline-like effect on gaming PC sales, as 2022 was not a very exciting year for game releases.

Although AMD’s graphics market share may be in the 20% range, AMD PC gaming enthusiasts are among the most animated and energetic PC users I’ve encountered. My interviews with several AMD executives at the RDNA 3 event conveyed that vibrancy.

I was particularly taken by my brief discussion with HipHopGamer, a YouTube influencer who channeled the enthusiasm of the gamers in attendance at the event. People have strong opinions about these graphics cards. You can see HipHopGamer’s excitement for yourself, along with my other interviews with AMD executives, in this video:

While it’s difficult to assess the impact of these new AMD graphics cards until industry benchmarking experts compare these solutions to Nvidia’s offerings, credit AMD with keeping its innovation pedal to the metal, which should keep Nvidia on its toes.

Designed to power high-frame-rate 4K and higher-resolution gaming through its innovative chiplet implementation and second-generation ray-tracing capability, AMD’s RDNA 3-based graphics cards will enable developers to bring immersive games to market that we could not even imagine a few years ago.

Creative professionals producing high-capacity digital video and multimedia content will also see possibilities in these new solutions. It’s a powerful message that I’d like to see AMD emphasize more in its marketing.

As criminal activity on the Internet continues to intensify, hunting bugs for cash is attracting more and more security researchers.

In its latest annual report, bug bounty platform Intigriti revealed a 43% increase in the number of researchers signing up for its services from April 2021 to April 2022. For Intigriti alone, this meant adding 50,000 researchers.

For the most part, the report noted, bug bounty hunting is part-time work, with 54% of researchers holding full-time jobs and another 34% being full-time students.

“Bug bounty programs are tremendously successful for both organizations and security researchers,” said Ray Kelly, a fellow at WhiteHat Security, an application security provider in San Jose, Calif., which was recently acquired by Synopsys.

“Effective bug bounty programs limit the impact of serious security vulnerabilities that could easily have put an organization’s customer base at risk,” he told TechNewsWorld.

“Payments for bug reports can sometimes exceed six-figure amounts, which may seem like a lot,” he said. “However, the cost of fixing and recovering a zero-day vulnerability for an organization can total millions of dollars in lost revenue.”

‘Good Faith’ Rewarded

As if that weren’t incentive enough to become a bug bounty hunter, the US Department of Justice recently sweetened the career path by adopting a policy stating that it would not enforce the federal Computer Fraud and Abuse Act against hackers who act in “good faith” when attempting to discover flaws in software and systems.

“The recent policy change to prevent prosecuting researchers is welcome and long-awaited,” said Mike Parkin, senior technical engineer at Vulcan Cyber, a provider of SaaS for enterprise cyber risk prevention in Tel Aviv, Israel.

“The fact that researchers have, over the years, tried to help find security flaws under a legal regime that amounted to ‘no good deed goes unpunished’ shows their dedication to doing the right thing, even when doing the right thing meant risking fines and jail time,” he told TechNewsWorld.

“This policy change removes a fairly significant obstacle to vulnerability research, and we can expect it to pay dividends quickly, with more people discovering bugs in good faith without the risk of jail time for doing so.”

Today, ferreting out bugs in other people’s software is considered a respectable occupation, but that wasn’t always the case. “There used to be a lot of issues when bug bounty hunters found vulnerabilities,” said James McQuiggan, a security awareness advocate at KnowBe4, a security awareness training provider in Clearwater, Fla.

“Organizations would take a lot of offense and try to accuse the researcher, when, in fact, the researcher wanted to help,” he told TechNewsWorld. “The industry has recognized this, and now dedicated email addresses have been established to receive such information.”

Benefits of Multiple Eyes

Over the years, companies have come to realize what bug bounty programs can bring to the table. “The task of discovering and prioritizing weaknesses and unintended consequences is not, and should not be, the focus of an organization’s resources or efforts,” explained Casey Ellis, CTO and founder of Bugcrowd, which operates a crowdsourced bug bounty platform.

“As a result, a more scalable and effective answer to the question ‘where am I most likely to be compromised’ is no longer just considered a good idea, but a necessity,” he told TechNewsWorld. “This is where bug bounty programs come into play.”

“Bug bounty programs are a proactive way to spot vulnerabilities and reward one’s good work and discretion,” said Davis McCarthy, a lead security researcher at Valtix, a provider of cloud-native network security services in Santa Clara, Calif.

“The old adage, ‘Many eyes make all the bugs shallow,’ is true, because there is a dearth of talent in the field,” he told TechNewsWorld.

Parkin agreed. “With the sheer complexity of modern code and the myriad interactions between applications, it’s important to have more responsible eyes looking for flaws,” he said.

“Threat actors are always working to find new vulnerabilities they can exploit, and the threat landscape in cybersecurity has only gotten more hostile,” he continued. “The rise of bug bounties is a way for organizations to bring independent researchers into the game on their side. It’s a natural response to an increase in sophisticated attacks.”

Bad Actor Reward Program

Although bug bounty programs have gained greater acceptance among businesses, they can still cause friction within organizations.

“Researchers often complain that even when firms have a coordinated disclosure or bug bounty program, a lot of pushback or friction exists. They often feel slighted or pushed aside,” said Archie Agarwal, founder and CEO of ThreatModeler, an automated threat modeling provider in Jersey City, N.J.

“Organizations, for their part, often get stuck when presented with a disclosure because the researcher found a fatal design flaw that would require months of concerted effort to rectify,” he told TechNewsWorld. “Perhaps some would prefer that these kinds of flaws stay out of sight.”

“The effort and expense of fixing design flaws after a system has been deployed is a significant challenge,” he continued. “The surest way to avoid this is by threat modeling systems as their designs evolve. That gives organizations the ability to plan for and deal with these flaws proactively, while they are still only potential flaws.”

Perhaps the biggest proof of the effectiveness of bug bounty programs is that malicious actors have begun to adopt the practice. The LockBit ransomware gang is offering payments to those who discover vulnerabilities in its leak website and its code.

“This development is novel; however, I doubt they will get many takers,” predicted John Bambenek, principal threat hunter at Netenrich, a San Jose, Calif.-based IT and digital security operations company.

“I know that if I find a vulnerability, I’m going to use it to put them in jail,” he told TechNewsWorld. “If a criminal finds one, it will be used to steal from them, because there is no respect among ransomware operators.”

“Ethical hacking programs have been hugely successful. It is no surprise to see ransomware groups refining their methods and services in the face of that competition,” said Casey Bisson, head of product and developer relations at BluBracket, a cybersecurity services company in Menlo Park, Calif.

He warned that attackers are increasingly aware that they can buy access to the companies and systems they want to attack.

“Every enterprise needs to look at the security of its internal supply chain, including who has access to its code and any secrets therein,” he told TechNewsWorld. “Unethical bounty programs like these turn the passwords and keys in your code into a payday for whoever has access to it.”

The cost of cleaning up data often sits beyond the comfort zone of businesses awash in potentially dirty data. Building trust in that data is what paves the way for reliable and compliant corporate data flows.

According to Kyle Kirwan, co-founder and CEO of data observability platform Bigeye, few companies have the resources needed to develop tools for challenges such as large-scale data observability. As a result, many companies are essentially flying blind, reacting when something goes wrong instead of continually addressing data quality.

A data trust provides a legal framework for managing shared data. It promotes collaboration through common rules for data protection, privacy, and confidentiality, and it enables organizations to securely connect their data sources to a shared repository of data.

Bigeye brings together data engineers, analysts, scientists, and stakeholders to build trust in data. Its platform helps companies create SLAs for monitoring and anomaly detection, ensuring data quality and reliable pipelines.

With full API access, a user-friendly interface, and automated yet flexible customization, data teams can monitor quality, proactively detect and resolve issues, and ensure that every user can rely on the data.
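To make the idea concrete, here is a minimal sketch of what an SLA-style freshness and volume check on a single table might look like. It illustrates the concept only and is not Bigeye’s actual API; the `check_table_sla` helper, the thresholds, and the `loaded_at` column are hypothetical.

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical SLA thresholds for one table (illustrative, not Bigeye's API).
SLA = {"max_hours_since_load": 6, "min_row_count": 1000}

def check_table_sla(conn: sqlite3.Connection, table: str) -> list:
    """Return a list of human-readable SLA violations for `table`.

    Assumes `loaded_at` is stored as an ISO-8601 UTC timestamp string.
    """
    violations = []

    # Freshness: hours since the most recent record was loaded.
    (last_loaded,) = conn.execute(f"SELECT MAX(loaded_at) FROM {table}").fetchone()
    age_hours = (datetime.now(timezone.utc)
                 - datetime.fromisoformat(last_loaded)).total_seconds() / 3600
    if age_hours > SLA["max_hours_since_load"]:
        violations.append(f"{table} is stale: last load was {age_hours:.1f}h ago")

    # Volume: guard against silently empty or truncated loads.
    (row_count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    if row_count < SLA["min_row_count"]:
        violations.append(f"{table} has only {row_count} rows")

    return violations
```

In practice, checks like these would run on a schedule, with violations routed to an alerting channel rather than returned to the caller.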

Uber Data Experience

Two early members of the data team at Uber — Kirwan and Bigeye co-founder and CTO Egor Gryaznov — set out to use what they learned building at Uber’s scale to create easy-to-deploy SaaS tools for data engineers.

Kirwan was one of Uber’s first data scientists and its first metadata product manager. Gryaznov was a staff-level engineer who managed Uber’s Vertica data warehouse and developed a number of internal data engineering tools and frameworks.

They realized that the tools they were building to manage Uber’s vast data lake and its thousands of internal data users were not available to most data engineering teams.

Automatically monitoring and detecting reliability issues within thousands of tables in a data warehouse is no easy task. Companies like Instacart, Udacity, Docker, and Clubhouse use Bigeye to make their analysis and machine learning work consistently.

A Growing Area

Founding Bigeye in 2019, the pair recognized the growing problem of enterprises deploying data in operational workflows, machine learning-powered products and services, and high-ROI use cases such as strategic analytics and business intelligence-driven decision-making.

The data observability space saw several entrants in 2021. Bigeye differentiates itself from that pack by giving users the ability to automatically assess customer data quality with over 70 unique data quality metrics.

These metrics feed thousands of trained anomaly detection models to ensure that data quality problems — even the most difficult to detect — never get ahead of data engineers.
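As a rough illustration of the underlying idea — Bigeye’s actual models are proprietary and far more sophisticated — a toy anomaly detector might flag a daily table metric, such as row count, when it strays several standard deviations from its recent history:

```python
from statistics import mean, stdev

def is_anomalous(history, today, z_threshold=3.0):
    """Toy stand-in for a trained anomaly detection model: flag today's
    metric value if it sits more than `z_threshold` standard deviations
    away from its recent history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold

# Daily row counts for one table; the sudden drop suggests a broken load.
row_counts = [10_120, 10_340, 9_980, 10_200, 10_410, 10_150, 10_290]
print(is_anomalous(row_counts, 4_500))  # True
```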

Last year, data observability burst onto the scene, with at least ten data observability startups announcing significant funding rounds.

Kirwan predicted that this year data observability will become a priority for data teams as they seek to balance the demands of managing complex platforms with the need to ensure data quality and pipeline reliability.

Solution Rundown

Bigeye’s data platform is no longer in beta. Some enterprise-grade features are still on the roadmap, such as full role-based access control. But others, such as SSO and in-VPC deployment, are available today.

The app is closed source and uses proprietary models for anomaly detection. Bigeye is a big fan of open-source alternatives but decided to develop its own models to meet internally set performance goals.

Machine learning is used in a few key places to bring a unique mix of metrics to each table in a customer’s connected data sources. Anomaly detection models are trained on each of those metrics to detect abnormal behavior.

Three features built into the platform in late 2021 automatically detect and alert on data quality issues and enable data quality SLAs.

The first, deltas, makes it easy to compare and validate multiple versions of any dataset.

The second, issues, groups multiple alerts together with valuable context about related problems. This makes it easier to document past fixes and speed up resolution.

The third, dashboards, provides a holistic view of data health, helping teams identify data quality hotspots, close gaps in monitoring coverage, and measure improvements in reliability.
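For a sense of what comparing two versions of a dataset (the deltas feature) involves, here is a hypothetical sketch of a delta-style report built on pandas. It is not Bigeye’s implementation; the `dataset_delta` helper and the metrics it tracks are illustrative assumptions:

```python
import pandas as pd

def dataset_delta(old: pd.DataFrame, new: pd.DataFrame, cols):
    """Compare two versions of a dataset on a few simple quality metrics;
    a real delta would track many more metrics per column."""
    report = {"row_count": (len(old), len(new))}
    for col in cols:
        report[col] = {
            "null_rate": (old[col].isna().mean(), new[col].isna().mean()),
            "mean": (old[col].mean(), new[col].mean()),
        }
    return report

old = pd.DataFrame({"amount": [10.0, 12.5, None, 11.0]})
new = pd.DataFrame({"amount": [10.0, 12.5, 11.0, 450.0]})
print(dataset_delta(old, new, ["amount"]))  # null rate drops, mean jumps
```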

Eyeball Data Warehouse

TechNewsWorld spoke with Kirwan to uncover some of the complexities of the data-sniffing platform his company provides to data scientists.

TechNewsWorld: What makes Bigeye’s approach innovative or cutting edge?

Kyle Kirwan, Bigeye Co-Founder and CEO

Kyle Kirwan: Data observability requires consistent and thorough knowledge of what is happening inside all the tables and pipelines in your data stack. It is similar to the way SRE [site reliability engineering] and DevOps teams keep applications and infrastructure working around the clock, but repurposed for the world of data engineering and data science.

While data quality and data reliability have been issues for decades, data applications are now central to how many major businesses run, because any data loss, outage, or degradation can quickly result in lost revenue and customers.

Without data observability, data teams must continually react to data quality issues and untangle them as the business goes about using the data. A better approach is to proactively identify problems and fix the root causes.

How does trust affect the use of data?

Kirwan: Often, problems are discovered by stakeholders, such as executives who don’t trust their often-broken dashboards, or users who get confusing results from in-product machine learning models. Data engineers can get ahead of problems and prevent business impact if they are alerted early enough.

How does this concept differ from similar-sounding technologies like integrated data management?

Kirwan: Data observability is a core function within data operations (think: data management). Many customers look for best-of-breed solutions for each task within data operations. This is why technologies like Snowflake, Fivetran, Airflow, and dbt are exploding in popularity. Each is considered an important part of the “modern data stack” rather than a one-size-fits-none solution.

Data observability, data SLAs, ETL [extract, transform, load] code version control, data pipeline testing, and other techniques must be used together to keep modern data pipelines working smoothly, just as high-performing software engineering and DevOps teams use their counterpart technologies.
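By analogy with the test suites those software teams rely on, a data pipeline test might assert basic expectations about a table before downstream jobs consume it. This pytest-style sketch is illustrative only; the `orders` table and its checks are hypothetical:

```python
import sqlite3
import pytest

@pytest.fixture
def warehouse():
    # Stand-in for a real data warehouse connection.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.99), (2, 24.50)])
    return conn

def test_orders_table_not_empty(warehouse):
    assert warehouse.execute("SELECT COUNT(*) FROM orders").fetchone()[0] > 0

def test_order_amounts_are_non_negative(warehouse):
    bad = warehouse.execute(
        "SELECT COUNT(*) FROM orders WHERE amount < 0").fetchone()[0]
    assert bad == 0
```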

What role do data pipelines and DataOps play in data observability?

Kirwan: Data observability is closely related to the emerging practices of DataOps and data reliability engineering. DataOps refers to the broad set of operational challenges that data platform owners face. Data reliability engineering is a part, but only a part, of DataOps, just as site reliability engineering is related to, but does not encompass, all of DevOps.

Data security can benefit from data observability, as it can be used to identify unexpected changes in query volume on different tables or changes in the behavior of ETL pipelines. However, data observability by itself is not a complete data protection solution.

What challenges does this technology face?

Kirwan: These challenges include issues such as data discovery and governance, cost tracking and management, and access control. They also include how to handle queries, dashboards, and the growing number of ML features and models.

Reliability and uptime are certainly challenges many DevOps teams are responsible for, but those teams are often also charged with other aspects such as developer velocity and security. Within those areas, data observability enables data teams to know whether their data and data pipelines are error-free.

What are the challenges of implementing and maintaining data observability technology?

Kirwan: Effective data observability systems must be integrated into the data team’s workflows. This enables the team to respond to data issues quickly and focus on growing the data platform rather than putting out data fires. However, poorly tuned data observability systems can result in a flood of false positives.

An effective data observability system should require little maintenance, automatically adapting to changes in the business rather than just running static tests for data quality issues. A poorly optimized system, however, may lose accuracy as the business changes and require manual tuning, which can be time-consuming.

Data observability can also be taxing on a data warehouse if not optimized properly. Bigeye’s team has experience optimizing large-scale data observability to ensure that the platform does not impact data warehouse performance.