
Daily Tech Digest - July 28, 2023

Cyber criminals pivot away from ransomware encryption

“Data theft extortion is not a new phenomenon, but the number of incidents this quarter suggests that financially motivated threat actors are increasingly seeing this as a viable means of receiving a final payout,” wrote report author Nicole Hoffman. “Carrying out ransomware attacks is likely becoming more challenging due to global law enforcement and industry disruption efforts, as well as the implementation of defences such as increased behavioural detection capabilities and endpoint detection and response (EDR) solutions,” she said. In the case of Clop’s attacks, Hoffman observed that it was “highly unusual” for a ransomware group to so consistently exploit zero-days given the sheer time, effort and resourcing needed to develop exploits. She suggested this meant that Clop likely has a level of sophistication and funding that is matched only by state-backed advanced persistent threat actors. Given Clop’s incorporation of zero-days in managed file transfer (MFT) products into its playbook, and its rampant success in doing so …


Get the best value from your data by reducing risk and building trust

Data risk is potentially detrimental to the business due to data mismanagement, inadequate data governance, and poor data security. Data risk that isn’t recognized and mitigated can often result in a costly security breach. To improve security posture, enterprises need to have an effective strategy for managing data, ensure data protection is compliant with regulations, and look for solutions that provide access controls, end-to-end encryption, and zero-trust access, for example. Assessing data risk is not a tick-box exercise. The attack landscape is constantly changing, and enterprises must assess their data risk regularly to evaluate their security and privacy best practices. A data subject access request is an inquiry an individual submits asking how their personal data is harvested, stored, and used. It is a requirement of several data privacy regulations, including GDPR. It is recommended that enterprises automate these data subject requests so that they are easier to track, preserve data integrity, and are handled swiftly enough to avoid penalties.
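For illustration, here is a minimal sketch of what automating DSAR tracking might look like, assuming a simple in-house tracker; the class, the field names, and the 30-day window below are simplifications for the example, not a reference to any particular compliance product.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative sketch only: a tiny tracker for data subject access requests
# (DSARs). GDPR generally allows roughly a month to respond; the 30-day
# window below is a simplification for the example.
RESPONSE_WINDOW = timedelta(days=30)

@dataclass
class AccessRequest:
    subject_id: str
    received: date
    fulfilled: bool = False

    @property
    def due(self) -> date:
        return self.received + RESPONSE_WINDOW

    def overdue(self, today: date) -> bool:
        return not self.fulfilled and today > self.due

# Flag requests at risk of missing the regulatory deadline.
requests = [
    AccessRequest("user-1041", date(2023, 6, 20)),
    AccessRequest("user-2188", date(2023, 7, 15)),
]
today = date(2023, 7, 28)
for request in requests:
    if request.overdue(today):
        print(f"{request.subject_id}: overdue, was due {request.due}")
```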


Why Developers Need Their Own Observability

The goal of operators’ and site reliability engineers’ observability efforts is straightforward: Aggregate logs and other telemetry, detect threats, monitor application and infrastructure performance, detect anomalies in behavior, prioritize those anomalies, identify their root causes and route discovered problems to their underlying owner. Basically, operators want to keep everything up and running — an important goal but not one that developers may share. Developers require observability as well, but for different reasons. Today’s developers are responsible for the success of the code they deploy. As a result, they need ongoing visibility into how the code they’re working on will behave in production. Unlike operations-focused observability tooling, developer-focused observability targets issues that matter to developers, like document object model (DOM) events, API behavior, detecting bad code patterns and smells, identifying problematic lines of code and test coverage. Observability, therefore, means something different to developers than to operators, because developers want to look at application telemetry data in different ways to help them solve code-related problems.
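As a rough sketch of the idea, the snippet below shows the kind of code-centric telemetry a developer might wrap around an API handler; the `observe` decorator, the logger name and the `get_orders` handler are hypothetical, not part of any specific observability product.

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dev-observability")

def observe(func):
    """Record per-call latency and failures, attributed to the code itself."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        except Exception:
            # Point back at the function that failed, not just the host it ran on.
            log.exception("handler %s failed", func.__qualname__)
            raise
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            log.info("handler %s took %.1f ms", func.__qualname__, elapsed_ms)
    return wrapper

@observe
def get_orders(customer_id: str) -> list[str]:
    # Hypothetical handler standing in for real application code.
    return [f"order-{customer_id}-1"]

get_orders("42")
```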


Understanding the value of holistic data management

Data holds valuable insights into customer behaviour, preferences and needs. Holistic management of data enables organisations to consolidate and analyse their customers’ data from multiple sources, leading to a comprehensive understanding of their target audience. This knowledge allows companies to tailor their products, services and marketing efforts to better meet customer expectations, which can result in improved customer satisfaction and loyalty. Some on-market tools also let organisations map the relationships between their customers, revealing how those customers are connected in the real world. Establishing these customer relationships can be very beneficial, especially for targeted marketing. For example, an e-mail arrives in your inbox shortly before your anniversary, suggesting a specifically tailor-made gift for your partner. It is extremely important for an organisation to have a competitive edge and to stay relevant. Data that is not holistically managed will slow down the organisation's ability to make timely and informed decisions, hindering its ability to respond quickly to changing market dynamics and stay ahead of its competitors.
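A toy sketch of that consolidation step is shown below; the record layouts and the household-by-address heuristic are invented purely to illustrate how data from multiple sources can be merged and customers related to one another.

```python
# Illustrative sketch: consolidate customer records from two systems and
# link customers who appear to share a household, the kind of relationship
# the anniversary-email example relies on. All field names are made up.
crm = [
    {"id": "C1", "name": "Alice", "address": "12 Oak Lane", "anniversary": "2015-08-03"},
    {"id": "C2", "name": "Bob", "address": "12 Oak Lane", "anniversary": "2015-08-03"},
]
web_orders = [
    {"customer_id": "C1", "last_purchase": "jewellery"},
]

# Consolidate: build one profile per customer across both sources.
profiles = {c["id"]: {**c} for c in crm}
for order in web_orders:
    profiles[order["customer_id"]]["last_purchase"] = order["last_purchase"]

# Relate: customers at the same address are treated as one household.
households = {}
for profile in profiles.values():
    households.setdefault(profile["address"], []).append(profile["name"])

print(households)  # {'12 Oak Lane': ['Alice', 'Bob']}
```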


Why Today's CISOs Must Embrace Change

While this is a long-standing challenge, I've seen the tide turn over the past four or five years, especially when COVID happened. Just the nature of the event necessitated dramatic change in organizations. During the pandemic, CISOs who said "no, no, no" lost their place in the organization, while those who said yes and embraced change were elevated. Today we're hitting an inflection point where organizations that embrace change will outpace the organizations that don't. Organizations that don't will become the low-hanging fruit for attackers. We need to adopt new tools and technologies while, at the same time, we help guide the business across the fast-evolving threat landscape. Speaking of new technologies, I heard someone say AI and tools won't replace humans, but the humans that leverage those tools will replace those that don't. I really like that — these tools become the "Iron Man" suit for all the folks out there who are trying to defend organizations proactively and reactively. Leveraging all those tools in combination with great intelligence, I think, enables organizations to outpace the organizations that are moving more slowly and many adversaries.


Navigating Digital Transformation While Cultivating a Security Culture

When it comes to security and digital transformation, one of the first things that comes to mind for Reynolds is the tech surface. “As you evolve and transition from legacy to new, both stay parallel running, right? Being able to manage the old but also integrate the new, but with new also comes more complexity, more security rules,” he says. “A good example is cloud security. While it’s great for onboarding and just getting stuff up and running, they do have this concept of shared security where they manage infrastructure, they manage the storage, but really, the IAM, the access management, the network configuration, and ingress and egress traffic from the network are still your responsibility. And as you evolve to that and add more and more cloud providers, more integrations, it becomes much more complex.” “There’s also more data transference, so there are a lot of data privacy and compliance requirements there, especially as the world evolves with GDPR, which everyone hopefully by now knows.”


Breach Roundup: Zenbleed Flaw Exposes AMD Ryzen CPUs

A critical vulnerability affecting AMD's Zen 2 processors, including popular CPUs such as the Ryzen 5 3600, was uncovered by Google security researcher Tavis Ormandy. Dubbed Zenbleed, the flaw allows attackers to steal sensitive data such as passwords and encryption keys without requiring physical access to the computer. Tracked as CVE-2023-20593, the vulnerability can be exploited remotely, making it a serious concern for cloud-hosted services. The vulnerability affects the entire Zen 2 product range, including AMD Ryzen and Ryzen Pro 3000/4000/5000/7020 series, and the EPYC "Rome" data center processors. Data can be transferred at a rate of 30 kilobits per core, per second, allowing information extraction from various software running on the system, including virtual machines and containers. Zenbleed operates without any special system calls or privileges, making detection challenging. While AMD released a microcode patch for second-generation EPYC 7002 processors, other CPU lines will have to wait until at least October 2023.
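To put that leak rate in perspective, here is a quick back-of-envelope calculation based on the 30-kilobits-per-core figure quoted above; the core count is an arbitrary example, not a specific SKU.

```python
# Back-of-envelope only: scale of data exposure at the quoted leak rate.
rate_bits_per_core_per_s = 30_000   # 30 kilobits per core, per second
cores = 64                          # hypothetical many-core server

kib_per_core_per_hour = rate_bits_per_core_per_s * 3600 / 8 / 1024
aggregate_kbit_per_s = rate_bits_per_core_per_s * cores / 1000

print(f"~{kib_per_core_per_hour:,.0f} KiB leaked per core per hour")
print(f"~{aggregate_kbit_per_s:,.0f} kilobits/s across {cores} cores")
```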


The Role of Digital Twins in Unlocking the Cloud's Potential

A DT, in essence, is a high-fidelity virtual model designed to mirror an aspect of a physical entity accurately. Let’s imagine a piece of complex machinery in a factory. This machine is equipped with numerous sensors, each collecting data related to critical areas of functionality, from temperature to mechanical stress, speed, and more. This vast array of data is then transmitted to the machine’s digital counterpart. With this rich set of data, the DT becomes more than just a static replica. It evolves into a dynamic model that can simulate the machinery’s operation under various conditions, study performance issues, and even suggest potential improvements. The ultimate goal of these simulations and studies is to generate valuable insights that can be applied to the original physical entity, enhancing its performance and longevity. The resulting architecture is a dual cyber-physical system with a constant flow of data that carries unique insights from the digital realm back into the physical realm.
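A stripped-down sketch of that pattern is shown below; the `MachineTwin` class, the sensor fields and the thresholds are illustrative assumptions, not a reference to any particular digital-twin platform.

```python
import random

# Minimal sketch of the digital-twin pattern described above: the physical
# machine streams sensor readings to its virtual counterpart, which keeps a
# running model, runs crude what-if simulations, and flags conditions worth
# feeding back to the plant floor.
class MachineTwin:
    def __init__(self, machine_id: str, temp_limit_c: float = 90.0):
        self.machine_id = machine_id
        self.temp_limit_c = temp_limit_c
        self.history: list[dict] = []

    def ingest(self, reading: dict) -> None:
        """Mirror one telemetry sample from the physical machine."""
        self.history.append(reading)

    def simulate_load_increase(self, factor: float) -> float:
        """Crude what-if: project temperature if load rises by `factor`."""
        return self.history[-1]["temperature_c"] * factor

    def advisories(self) -> list[str]:
        """Insights to feed back to the physical machine."""
        latest = self.history[-1]
        notes = []
        if latest["temperature_c"] > self.temp_limit_c:
            notes.append("temperature above limit: schedule inspection")
        return notes

twin = MachineTwin("press-07")
for _ in range(5):
    twin.ingest({"temperature_c": random.uniform(70, 95), "rpm": 1200})

print(twin.simulate_load_increase(1.2))
print(twin.advisories())
```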


The power of process mining in Power Automate

Having tools that identify and optimize processes is an important foundation for any form of process automation, especially since, without them, we often must rely on manual walkthroughs. We need to be able to see how information and documents flow through a business in order to identify places where systems can be improved. Maybe there’s an unnecessary approval step between data going into line-of-business applications and then being booked into a CRM tool, where it sits for several days. Modern process mining tools take advantage of the fact that much of the data in our businesses is already labeled. It’s tied to database tables or sourced from the line-of-business applications we have chosen to use as systems of record. We can use these systems to identify the data associated with, say, a contract, and where it needs to be used, as well as who needs to use it. With that data we can then identify the process flows associated with it, using performance indicators to identify inefficiencies, as well as where we can automate manual processes—for example, by surfacing approvals as adaptive cards in Microsoft Teams or in Outlook.
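The core idea can be sketched in a few lines: given a timestamped event log for each case, count the transitions between steps and measure how long work waits at each one. The event names and timings below are invented for illustration; dedicated process mining tools do far more, but the principle is the same.

```python
from collections import Counter, defaultdict
from datetime import datetime

# Toy event log: (case id, step, timestamp), already in chronological order.
events = [
    ("contract-1", "entered in LOB app", "2023-07-01 09:00"),
    ("contract-1", "approval requested", "2023-07-01 09:05"),
    ("contract-1", "approved",           "2023-07-04 16:00"),
    ("contract-1", "booked in CRM",      "2023-07-04 16:10"),
    ("contract-2", "entered in LOB app", "2023-07-02 11:00"),
    ("contract-2", "approval requested", "2023-07-02 11:02"),
    ("contract-2", "approved",           "2023-07-06 10:00"),
    ("contract-2", "booked in CRM",      "2023-07-06 10:05"),
]

by_case = defaultdict(list)
for case, step, ts in events:
    by_case[case].append((step, datetime.strptime(ts, "%Y-%m-%d %H:%M")))

# Count each step-to-step transition and how long work waits between steps.
transitions = Counter()
waits = defaultdict(list)
for steps in by_case.values():
    for (a, t1), (b, t2) in zip(steps, steps[1:]):
        transitions[(a, b)] += 1
        waits[(a, b)].append((t2 - t1).total_seconds() / 3600)

for (a, b), hours in waits.items():
    avg = sum(hours) / len(hours)
    print(f"{a} -> {b}: seen {transitions[(a, b)]}x, avg wait {avg:.1f} h")
```

Even on this toy log, the "approval requested" to "approved" step stands out as the multi-day bottleneck, which is exactly the kind of candidate for an automated approval flow.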


Data Program Disasters: Unveiling the Common Pitfalls

In the realm of data management, it’s tempting to be swayed by the enticing promises of new tools that offer lineage, provenance, cataloguing, observability, and more. However, beneath the glossy marketing exterior lies the lurking devil of hidden costs that can burn a hole in your wallet. Let’s consider an example: while you may have successfully negotiated a reduction in compute costs, you might have overlooked the expenses associated with data egress. This oversight could lead to long-term vendor lock-in or force you to spend the hard-earned savings secured through skilful negotiation on the data outflow. This is just one instance among many; there are live examples where organizations have chosen tools solely based on their features, only to discover later that those tools did not fully comply with the regulations of their industry or the country they operate in. In such cases, you’re left with two options: either wait for the vendor to become compliant, severely stifling your Go-To-Market strategy, or supplement your setup with additional services, effectively negating your cost-saving efforts and bloating your architecture.
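A quick, entirely hypothetical back-of-envelope shows how easily a negotiated compute discount can be consumed by egress charges; all prices and volumes below are made up, not any vendor's actual rates.

```python
# Hypothetical figures only: compare a negotiated compute saving with the
# overlooked cost of data leaving the platform.
monthly_compute_before = 50_000   # USD per month before the discount
compute_discount = 0.15           # negotiated 15% reduction

egress_tb_per_month = 200         # data transferred out each month
egress_price_per_gb = 0.09        # assumed per-GB egress rate

compute_savings = monthly_compute_before * compute_discount
egress_cost = egress_tb_per_month * 1024 * egress_price_per_gb

print(f"compute savings: ${compute_savings:,.0f}/month")
print(f"egress cost:     ${egress_cost:,.0f}/month")
```

With these illustrative numbers, the egress bill is more than double the hard-won compute saving, which is exactly the kind of hidden cost the paragraph above warns about.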



Quote for the day:

"It's very important in a leadership role not to place your ego at the foreground and not to judge everything in relationship to how your ego is fed." -- Ruth J. Simmons


