Akamai Announces Content Protector to Stop Scraping Attacks

Akamai Technologies, Inc., the cloud company that powers and protects life online, announced the availability of Content Protector, a product that stops scraping attacks without blocking the good traffic that companies need to enhance their business.

Scraper bots are a critical and often productive part of the commerce ecosystem. These bots search for new content, highlight products on comparison sites, and gather updated product information to share with customers. Unfortunately, scrapers are also used for harmful purposes such as competitive undercutting, surveillance ahead of inventory-hoarding attacks, and counterfeiting of goods and websites. Because scrapers hit sites around the clock unless they are stopped, they can degrade site performance, which frustrates consumers and causes them to abandon their visits. Scrapers have also become far more evasive and sophisticated over the past few years.


Akamai Content Protector helps detect and mitigate evasive scrapers that steal content for malicious purposes. It improves site performance, enhances the user experience, and protects intellectual property, while delivering significantly better detections and fewer false negatives without increasing the rate of false positives. The product is designed for companies that need to protect their reputation and revenue potential. It offers tailored detections that include:

  • Protocol-level assessment: Protocol fingerprinting checks how visitors connect to your site to ensure they are legitimate. It evaluates how the client establishes the connection with the server at the different layers of the Open Systems Interconnection (OSI) model, verifying that the negotiated parameters align with those expected from the most common web browsers and mobile applications.
  • Application-level assessment: Evaluates whether the client can run business logic written in JavaScript. When the client runs the JavaScript, it collects device and browser characteristics and user preferences. These data points are compared and cross-checked against the protocol-level data to verify consistency.
  • User interaction: Analyzes user interactions to distinguish between human and bot traffic. It assesses how users interact with touch screens, keyboards, and mice, identifying bots through their lack of interaction or abnormal usage patterns.
  • User behavior: Monitors visitor behavior on your website to identify unusual patterns indicative of bots.
  • Risk classification: Provides a deterministic and actionable low-, medium-, or high-risk classification of the traffic, based on the anomalies found during the evaluation (a simplified sketch of this kind of rollup follows the list).
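
The short TypeScript sketch below is purely illustrative and is not drawn from Akamai's implementation: it shows one way anomaly signals from the four assessments above could be combined into the deterministic low-, medium-, or high-risk classification the product describes. All names, thresholds, and signal counts here are hypothetical assumptions.

```typescript
// Hypothetical illustration only: these names and thresholds are not part of
// Akamai Content Protector; they sketch how anomaly signals from the four
// checks described above could roll up into a deterministic risk label.

type RiskLevel = "low" | "medium" | "high";

interface DetectionSignals {
  protocolAnomalies: number;     // e.g. TLS/HTTP parameters that deviate from common browsers
  applicationAnomalies: number;  // e.g. JavaScript-collected traits inconsistent with protocol data
  interactionAnomalies: number;  // e.g. missing or abnormal touch/keyboard/mouse events
  behaviorAnomalies: number;     // e.g. unusual navigation patterns across the session
}

// Deterministic rollup: the same signals always produce the same label,
// so downstream policy (allow, challenge, block) stays predictable.
function classifyTraffic(signals: DetectionSignals): RiskLevel {
  const total =
    signals.protocolAnomalies +
    signals.applicationAnomalies +
    signals.interactionAnomalies +
    signals.behaviorAnomalies;

  if (total === 0) return "low";    // nothing unusual at any layer
  if (total <= 2) return "medium";  // isolated anomalies: challenge rather than block
  return "high";                    // anomalies across layers: likely an evasive scraper
}

// Example: a client that negotiates an uncommon TLS profile and never
// produces interaction events would be flagged as high risk.
console.log(classifyTraffic({
  protocolAnomalies: 2,
  applicationAnomalies: 0,
  interactionAnomalies: 1,
  behaviorAnomalies: 0,
})); // -> "high"
```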

“Content Protector is more than just a security tool; it’s a business enabler,” says Rupesh Chokshi, SVP and GM, Application Security at Akamai. “By safeguarding your digital assets from scraping threats, it prevents competitors from undercutting your offers, enhances site performance to keep customers engaged, and protects your brand from counterfeiters. Content Protector provides direct business value to grow your digital business with confidence.”

SOURCE: PRNewswire
