The toughest part of Identity and Access Management (IAM) technology is making it work with multi-vendor infrastructure and the growing number of applications that enterprises rely on to do business. This is primarily because the last-mile integration of applications and identity systems has traditionally been hard-coded to allow the exchange of information about a user: their identity, roles, and access permissions.

In the early days of identity management, organizations had to write bespoke code to integrate each app with the identity system. With the advent of software-as-a-service (SaaS) apps, this model was no longer viable because you don’t control the code of a SaaS application.

Instead, identity vendors began building and maintaining connectors to support individual apps as needed. This model worked because the identity vendors shared the connectors among all of their customers, who were happy that they no longer needed to write their own integration code.

This approach was initially scalable, as there were only a dozen or so popular SaaS apps. As their numbers grew, however, it became problematic to maintain and test the app connectors needed to keep everything working.

Customers had no objections, since the connectors were managed and delivered by the identity system providers. But increasingly, those connectors could not support apps that did not work with identity standards such as SAML or OpenID Connect (OIDC).

Identity Orchestration Recipe

In the cloud era, connectors are reaching their breaking point. Just as connectors were once created to address an industry pain point, a new model has been designed to solve the connector impasse: the identity orchestration recipe.

This evolutionary approach replaces connectors by eliminating the need for them in the first place. It addresses ‘last-mile’ integration securely with a universal session that works with any app running anywhere, with no need to rewrite apps.


Identity orchestration enables customers to define use cases in terms of repeating patterns and templates called recipes, which shifts the focus of work from plumbing to innovation and allows businesses to concentrate on higher-level concerns such as customer experience. This is possible because security is built into a plug-and-play integration model that doesn’t require custom code.

Example uses include implementing personalized user journeys, modernizing apps, rolling out passwordless authentication, supporting multiple identity providers (IDPs), and more. Each recipe can be applied to hundreds of apps.

Consider Lego building blocks. Anyone with a big enough box of Legos can build something amazing – provided they have the time and skill. For most people, though, it’s far easier to use a pre-designed kit to build a Star Wars Millennium Falcon. You get what you want faster and more easily if everything you need is right there, and you can assemble it by following simple instructions.

Identity orchestration recipes work in much the same fashion and are focused on achieving a desired result.

Getting Started

Implementing orchestration recipes is as simple as browsing a ‘cookbook’ of use-case recipes and plugging them into your identity fabric. Here are some easy steps to help you get started (a sketch of what a recipe might look like follows the list):

  • Create a list of apps, users, and identity systems: What materials do you have to work with? Start with a list of your identity systems, then a list of your applications. Finally, make a list of your users: are you talking about customers, employees, partners, or all of the above?
  • Connect the ingredients: Once you have the systems, applications, and users buckets worked out, the recipe comes down to how you connect, or integrate, those three circles: users, apps, and systems (identity providers, authentication, and other tools).
  • Apply recipes: Like boiling an egg, this can be as simple or as complex as you want. Most recipes are implemented in hours or days rather than weeks or months.
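
To make the idea concrete, here is a minimal, hypothetical sketch of what a recipe might look like, expressed as plain Python data. The schema and every name in it (idp, mfa, apps, flow, the app names) are illustrative assumptions, not any vendor’s actual format:

```python
# A hypothetical orchestration recipe expressed as plain Python data.
# The field names and values are illustrative only.
passwordless_recipe = {
    "name": "passwordless-login",
    "idp": "azure-ad",                  # where users authenticate
    "mfa": "fido2-passkey",             # the passwordless factor
    "apps": ["hr-portal", "expense-app", "crm"],  # apps the recipe covers
    "flow": [                           # the user journey, step by step
        "identify-user",
        "authenticate-with-passkey",
        "issue-universal-session",
        "route-to-app",
    ],
}

def apply_recipe(recipe: dict) -> None:
    """Apply one recipe to every listed app -- no per-app connector code."""
    for app in recipe["apps"]:
        steps = " -> ".join(recipe["flow"])
        print(f"{app}: {recipe['idp']} handles [{steps}]")

apply_recipe(passwordless_recipe)
```

Extending the recipe to another app is then just another entry in the apps list, rather than a new connector.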


Best Practices

Recipes don’t need to be complicated. Here are some best practices to keep in mind:

  • Focus on the use cases you want to orchestrate: Think about your business use cases and write them down. A whiteboard or a sheet of paper will do. Are you looking to modernize apps and identity? Do you need to roll out passwordless MFA? Do you want to streamline user sign-up and sign-on experiences?
  • Define the user journey you want for each recipe: The fastest way to create a recipe is to ask: “Users are trying to accomplish something. What do we want to happen?” From there, the flow of the orchestration starts to take shape.
  • Remember that the ingredients in a recipe are interchangeable: Don’t get stuck on how it will work with any particular component (IDP, authentication, app, etc.). Recipes let you swap out one technology for another. For example, if you need to migrate from a legacy SiteMinder system to Azure AD, just swap out the identity provider, and the rest of the user flow will continue to work (see the sketch after this list).
  • Get buy-in: Use recipes and their results to gain buy-in from business decision-makers and stakeholders by demonstrating the results they can expect. This saves time and money, since it is easier to walk through a recipe on a whiteboard than to stage a software demo. It’s also easy to build and demonstrate a quick proof of concept, then scale it up to hundreds of apps once the business is on board.
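
To illustrate the interchangeable-ingredients point, here is a small, hypothetical sketch in the same style as the earlier one; the field names are again illustrative, not any vendor’s schema:

```python
# Hypothetical sketch of swapping an ingredient: the IDP changes,
# but the user flow does not.
legacy_recipe = {
    "idp": "siteminder",  # legacy identity provider
    "flow": ["identify-user", "authenticate", "issue-universal-session"],
}

# Migrate to Azure AD by replacing only the "idp" ingredient.
modern_recipe = {**legacy_recipe, "idp": "azure-ad"}

assert modern_recipe["flow"] == legacy_recipe["flow"]  # flow preserved
print(modern_recipe["idp"])  # azure-ad
```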

Final Thoughts

Recipes can also be adapted to changing needs as the organization grows. If you have a specific access policy for your employees, you can apply the same recipe to all the apps they use without having to treat each one individually. Apply the recipe to 700 applications, and you’re done; there is no need to build 700 connectors. Making modifications is as easy as swapping the bourbon for another whiskey in an Old Fashioned cocktail.

Like a Lego kit that lets you arrive at your desired result faster and more efficiently, identity orchestration recipes provide a holistic approach to solving complex IAM use-case challenges.

Running a business without some cloud support is rare these days. Yet when crafting a cloud strategy, companies tend to make some common mistakes. Here are several of them.

Making Your Cloud Strategy an IT-Only Strategy

Gartner said this week at its IT Infrastructure, Operations and Cloud Strategies conference that a successful cloud strategy requires support from outside of IT.

“Business and IT leaders should avoid the mistake of building an IT-focused strategy and then trying to ‘sell’ it to the rest of the business,” Marco Meinardi, vice president analyst at Gartner, said in a statement. “Business and IT must be equal partners in the definition of cloud strategy.”

“Technology for technology’s sake is generally not a good idea,” said David Smith, vice president analyst at Gartner.

“Anytime you do something, you want a clear vision of why you’re doing it, what the business reason is,” Smith told TechNewsWorld.

“People look at it and say, ‘It’s technology. Let the technologists deal with it,’” he continued. “What happens then is that people focus on the adoption phase – which is about how you do things and when – which is different from the strategy part, which focuses on what you’re doing and why you’re doing it.”

Not Having an Exit Strategy

Organizations often do not have an exit strategy in place with a cloud provider because they do not envision leaving the cloud. Furthermore, formulating such a strategy can be difficult. “People don’t like the answers they’re going to get, so they avoid it,” Smith said.

During the early days of the cloud, vendor lock-in was a significant fear, but that’s less the case today, said Tracy Wu, an analyst at Forrester, a national market research company headquartered in Cambridge, Mass.

“Some companies will prefer to be locked in with a specific vendor to get to market sooner or take advantage of specific pricing or services,” Wu told TechNewsWorld.

Still, she said, “organizations should always think of a plan B whether it’s the cloud or some other option.”

“That being said,” she continued, “it’s rare to hear of companies that actually pull out of a specific cloud provider completely.”

Confusing Cloud Strategy Planning With Cloud Implementation Planning

Organizations should always have a cloud strategy plan in place before implementation or adoption. The strategy is created during a decision phase in which business and IT leaders determine the role of cloud computing in the organization. Next comes a cloud implementation plan, which puts the cloud strategy into effect.

“If you call something a strategy, and it’s really an adoption plan, you end up with hundreds of pages of details that aren’t of interest to business people, so you scare them away,” Smith explained.

“A good cloud strategy should be a short and consumable document, consisting of 10 to 20 pages or slides,” Meinardi said.

Some areas commonly overlooked in cloud strategy planning, as identified by Wu, include key goals, revenue targets, new revenue streams, and the new business or traction the organization wants to build using the cloud.

“Too often, companies get in a rush to adopt the cloud and only think about the implementation aspect without thinking of the higher goals or the larger strategy at hand,” she said.

Equating a Cloud Strategy With Migrating Everything to the Cloud

Meinardi explained that many business and IT leaders shy away from formulating a cloud strategy because it would mean they would be forced to use cloud computing for everything. “Organizations should keep an open mind and partner with a non-cloud technology expert, such as an enterprise architect, who can bring a comprehensive perspective to the definition of your cloud strategy,” he advised.

On the other hand, some organizations believe that moving to the cloud is an easy task.

“One of the biggest challenges companies face is they think they can take what they have running and move it to the cloud,” said Jack E. Gold, founder and principal analyst at J.Gold Associates, an IT consulting company in Northborough, Mass.

“To get the best benefits from a cloud implementation, you need to rethink your applications, solutions, architecture and strategy,” Gold told TechNewsWorld.

“They also don’t do a great job of deciding which apps should stay on-premises and which should move to cloud environments,” he added.

“There are a lot of apps that will never go to the cloud,” he continued. “They’ve been around for 10 years. They’re going to be around for another 10 years. Why bother?”

Outsourcing the Development of Your Cloud Strategy

As tempting as it may be for business and IT leaders to have others build their cloud strategy, Gartner doesn’t recommend it. The strategy is too important to outsource, it said.

“It makes sense to outsource during the adoption phase where you may need outside expertise,” Smith said. “What happens, though, is that it’s all too easy to put yourself in a position where you’re allowing your vendors to define your strategy.”

“If you want to go out and get help from someone who knows what they’re doing, that’s fine, but you have to look at what they’re doing,” Gold said. “You don’t want to just throw checks at the wall. You need to be involved in figuring out your strategy, even if someone else is helping you put it together.”

Wu agreed. “I wouldn’t say that outsourcing strategy is a bad idea unless the entire strategy is being outsourced, with absolutely no direction from the company,” she said. “It’s really a big part of what leading global systems integrators do when they help design and implement cloud strategy.”

Equating Cloud Strategy With ‘Cloud First’

Gartner explained that the “cloud first” approach means that when someone wants to build or acquire new assets, the public cloud is the default place to do so.

“But cloud-first doesn’t mean cloud only,” Meinardi said. “If business and IT leaders adopt a cloud-first principle, their strategy must spell out the exceptions to the default option: the workloads that will live somewhere besides the cloud.”

Wu noted that among the assets best placed outside the cloud are data with strict residency requirements (data that cannot leave a specific region or country), data that must be physically located relatively close to where it is processed for latency or performance reasons, and workloads where data egress is very expensive, such as big data and AI applications.

Believing It’s Too Late to Design a Cloud Strategy

Gartner argues that it is never too late to develop a cloud strategy. “If organizations drive cloud adoption without a strategy in place, it will ultimately lead to resistance from those who are not aligned on the key drivers and principles of the strategy,” Meinardi said. “As a result, this resistance will slow cloud adoption and potentially jeopardize the entire cloud project.”

Applying artificial intelligence to medical images can be beneficial to clinicians and patients, but developing the tools to do so can be challenging. Google announced on Tuesday that it is ready to take on that challenge with its new medical imaging suite.

“Google pioneered the use of AI and computer vision in Google Photos, Google Image Search, and Google Lens, and we are now making our imaging expertise, tools and technology available to healthcare and life science enterprises,” Alissa Hsu Lynch, global lead of Google Cloud MedTech Strategy and Solutions, said in a statement.

Jeff Cribbs, Gartner’s vice president and distinguished analyst, explained that health care providers who are looking to AI for diagnostic imaging solutions are typically forced into one of two choices.

“They can purchase software from a device manufacturer, image store vendor or a third party, or they can build their own algorithms with industry-agnostic image classification tools,” he told TechNewsWorld.

“With this release,” he continued, “Google is taking their low-code AI development tooling and adding substantial healthcare-specific acceleration.”

“This Google product provides a platform for AI developers and also facilitates image exchange,” said Ginny Torno, administrative director of innovation and IT clinical, ancillary and research systems at Houston Methodist in Houston.

“It is not unique to this market, but can provide opportunities for interoperability that a smaller provider is not capable of,” she told TechNewsWorld.

Strong Components

According to Google, the medical imaging suite addresses some common pain points when developing AI and machine learning models. Components in the suite include:

  • Cloud Healthcare API, which allows easy and secure data exchange using DICOMweb, an international standard for imaging. The API provides a fully managed, scalable, enterprise-grade development environment with automated DICOM de-identification. Imaging technology partners include NetApp, for seamless on-premises to cloud data management, and Change Healthcare, a cloud-native enterprise imaging PACS in clinical use by radiologists. (A minimal access sketch follows this list.)
  • AI-assisted annotation tools from Nvidia and MONAI to automate the highly manual and repetitive task of labeling medical images, as well as native integration with any DICOMweb viewer.
  • Access to BigQuery and Looker to view and search petabytes of imaging data to perform advanced analysis and create training datasets with zero operational overhead.
  • Vertex AI, to accelerate the development of AI pipelines and build scalable machine learning models, with up to 80% fewer lines of code required for custom modeling.
  • Flexible options for cloud, on-premises, or edge deployment to allow organizations to meet diverse sovereignty, data security, and privacy needs – while providing centralized management and policy enforcement with Google Distributed Cloud, enabled by Anthos.
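
As a concrete illustration of that DICOMweb-standard approach, here is a minimal sketch of listing the studies in a Cloud Healthcare API DICOM store through its DICOMweb (QIDO-RS) interface. The project, location, dataset, and store names are placeholders, and the snippet assumes Application Default Credentials have been configured:

```python
# Minimal sketch: search studies in a Cloud Healthcare API DICOM store
# via its DICOMweb (QIDO-RS) endpoint. Resource names are placeholders;
# assumes Application Default Credentials are already set up.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

base = (
    "https://healthcare.googleapis.com/v1/projects/my-project"
    "/locations/us-central1/datasets/my-dataset/dicomStores/my-store"
)

# QIDO-RS study search, requesting the DICOM JSON representation.
response = session.get(
    f"{base}/dicomWeb/studies",
    headers={"Accept": "application/dicom+json"},
)
response.raise_for_status()

for study in response.json():
    # Tag 0020000D holds the Study Instance UID in DICOM JSON.
    print(study["0020000D"]["Value"][0])
```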

Full Deck of Tech

“One key differentiator of the medical imaging suite is that we are offering a comprehensive suite of technologies that support the process of delivering AI from start to finish,” Lynch told TechNewsWorld.

The suite offers everything from imaging data ingestion and storage to AI-assisted annotation tools to flexible model deployment options on the edge or in the cloud, she explained.

“We are providing solutions that will make this process easier and more efficient for health care organizations,” she said.

Lynch said the suite takes an open, standardized approach to medical imaging.

“Our integrated Google Cloud services work with a DICOM-standard approach, allowing customers to seamlessly leverage Vertex AI for machine learning and BigQuery for data discovery and analytics,” she added.

“By building everything around this standardized approach, we’re making it easier for organizations to manage their data and make it useful.”

Image Classification Solutions

The increasing use of medical imaging, coupled with staffing issues, has made the field ripe for solutions based on artificial intelligence and machine learning.

“As imaging systems get faster, offering higher resolution and capabilities like functional MRI, it is harder for the infrastructure to maintain those systems and, ideally, stay ahead of what is needed,” Torno said.

“In addition, there is a reduction in the radiology workforce that complicates the personnel side of the workload,” she said.

[Image: Google Cloud’s medical imaging suite, which aims to make health care imaging data more accessible, interoperable and useful. (Image Credit: Google)]


She explained that AI can identify issues found in an image from a learned set of images. “It may recommend a diagnosis that then only needs interpretation and confirmation,” she said.

“If the image detects a potentially life-threatening situation, it can also project the images to the top of a task queue,” she continued. “AI can also streamline workflows by reading images.”

Machine learning does for medical imaging what it did for facial recognition and image-based search. “Instead of identifying a dog, Frisbee or chair in a photograph, AI is identifying the extent of a tumor, bone fracture or lung lesion in a diagnostic image,” Cribbs explained.

Tools, Not Substitutes

Michael Arrigo, managing partner of No World Borders, a national network of expert witnesses on health care issues in Newport Beach, Calif., agreed that AI could help overworked radiologists, but only if it is reliable.

“Data should be structured in ways that are usable and consumable by AI,” he told TechNewsWorld. “AI doesn’t work well with highly variable unstructured data in unpredictable formats.”

Torno said that many studies of AI accuracy have been done, and more will follow.

“While there are examples of AI catching things a human did not, or being ‘just as good’ as a human, there are also examples where an AI misses something important, or isn’t sure what to interpret because the patient may have multiple problems,” she observed.

“AI should be seen as an efficiency tool to accelerate image interpretation and assist in emergent cases, but should not completely replace the human element,” she said.

Big Splash Potential

With its resources, Google can have a significant impact on the medical imaging market. “Having a major player like Google in this area could facilitate synergy with other Google products already in place in healthcare organizations, potentially enabling more seamless connectivity to other systems,” Torno said.

“If Google focuses on this market segment, they have the resources to make a splash,” she continued. “There are already many players in this area. It will be interesting to see how this product can take advantage of other Google functionality and pipelines and become a differentiator.”

Lynch pointed out that with the launch of the medical imaging suite, Google hopes to help accelerate the development and adoption of AI for imaging by the health care industry.

“AI has the potential to help reduce the burden for health care workers and improve and even save people’s lives,” she said.

“By offering our imaging tools, products and expertise to healthcare organizations, we are confident that the market and patients will benefit,” she added.

The cloud gaming market appears to be poised for some significant growth, though it will be tough for new players to enter the scene.

In her newsletter published Tuesday, consumer technology guru Elizabeth Parks said the cloud gaming market is at an inflection point as heavyweights in the industry continue their involvement in it, and the popularity of gaming in consumer homes grows.

In 2021, 75% of US broadband households reported playing video games for at least an hour a week, and 30% of those households subscribed to or tested a free or paid gaming service, according to Parks, who is president and CMO of Parks Associates, headquartered in Addison, Texas.

“Cloud gaming services provide a new opportunity to serve the gaming market and capture the consumer segment without gaming consoles or PC gaming hardware,” she wrote.

“Continuing advances in technology, growing expectations for entertainment consumption to be cross-platform, and the potential for cloud gaming inclusion in ecosystem strategies make this an interesting market to watch,” she said.

Few New Entrants

However, Parks predicted that there would be few new entrants to the market, because setting up and operating a cloud gaming service is extremely costly and challenging.

The most important requirement, she continued, is performance-competitive cloud infrastructure. Given the makeup of the existing competitors, any new entrant is likely to be a party willing to use the cloud resources of one of those competitors, or one that already has sufficient cloud computing infrastructure of its own.

One place a new player could get the infrastructure it needs is Google, noted Ross Rubin, principal analyst at Reticle Research, a consumer technology consulting firm in New York City. “Google’s decision to focus on white label offerings indicates that it thinks it has better prospects in partnership than going it alone as a first-party service,” he told TechNewsWorld.

The window for newcomers isn’t closed, but it may be narrow, he said. “It’s still a bullish market. In contrast to the relatively expensive subscription end, there are more opportunities at the low-cost, ad-driven end of the market.”

battling established brands

Mark N. Vena, president and principal analyst at SmartTech Research in San Jose, Calif., agreed that conditions are becoming tougher for newcomers to the market.

“For companies that do not have a history in the gaming space, it is difficult to be seen as credible, as many established players have strong brand reputations around gaming, especially from the standpoint of legacy gaming titles,” he told TechNewsWorld.

“Both Microsoft and Sony really captured the market a few years ago by grabbing some of the more prestigious gaming studios with franchise titles under their belts, which shuts out potential new entrants,” he said.

“Netflix, for example, is clearly trying to make a foray into the cloud gaming space and is running into difficulty because they don’t have well-known titles in their gaming arsenal and, more importantly, they are not considered by consumers as a destination for gaming,” he said.

Established players can also trade losses for market share. “Microsoft has focused on using its cloud service as a loss leader. Most companies can’t afford to do that,” David Cole, an analyst at DFC Intelligence, a market research firm in San Diego, told TechNewsWorld.

Entering the gaming market is usually a daunting proposition to begin with, and doing it in the cloud poses additional hurdles, maintained Michael Inouye, a principal analyst at ABI Research, a global technology intelligence company.

“A new cloud gaming service will have a competitive disadvantage in most cases when it comes to game libraries,” he told TechNewsWorld. “Publishers aren’t ready to put their games on every cloud gaming service.”

“In some cases,” he continued, “publishers may push their own platform, enter into pre-existing deals with other cloud gaming services, or simply not agree to the business model.”

Cross-Platform Demand

Still, Inouye said the market is huge and there are opportunities available for new players, especially in mobile gaming.

“Mobile-based cloud gaming, at least for premium services, can be challenging in many cases due to competition with free-to-play,” he said, “but it may find success in the Asia-Pacific region, because gamers there have shown a willingness to pay for mobile game-based content, although revenue per player is lower.”

Parks also predicted that consumer desire for aggregation in the video streaming market would extend to cloud gaming. Cloud gaming service customers can respond to marketing campaigns focusing on the simplicity of a single point of subscription, purchase, billing and consumption — one that allows them to play across platforms, she wrote.

Along with increasing the appeal of the services to consumers, she said, this aggregation approach potentially generates more revenue for game developers by increasing their reach and making it convenient for consumers to subscribe to their content services.

“More consumers are seeking cross-platform gaming experiences so that they can experience and participate in gaming regardless of the device they are using – console, smartphone, tablet, PC, or even a Chromebook,” explained Vena.

“Gaming has now become a multi-platform phenomenon, and gamers do not want to be restricted to gaming on a single device or OS platform,” he continued. “This is a result of the multi-device world we live in now, which will only grow in importance as 5G connectivity becomes more widespread.”

Thrifty Gamers

Inouye agreed that there is growing demand for cross-platform titles as a whole, and gamers especially appreciate cross-platform purchases (meaning if you buy a game for a console, you also have access to the PC version), but gamers can be frugal, too.

“At the end of the day, consumers will always welcome the opportunity to play their games on more platforms, but it’s not as if they will pay for every copy or settle for compromises on all platforms to get that capability,” he said.

“Gamers who are willing to upgrade their hardware will not accept poor PC or console performance just to gain access to content on all three platforms for the same price,” he concluded.