
Daily Tech Digest - August 10, 2018

Addressing the AI Engineering Gap with Technology


Headline breakthroughs in AI have come fast and furious in recent years, fuelled by the rapid maturing of deep learning techniques, the success of GPUs at accelerating these compute-hungry tasks, and the availability of open-source libraries like TensorFlow, Caffe, Theano and PyTorch. This has accelerated innovation and experimentation, leading to impressive new products and services from large tech vendors like Google, Facebook, Apple, Microsoft, Uber and Tesla. However, I predict that these emerging AI technologies will be very slow to penetrate other industries. A handful of massive consumer tech companies already have the infrastructure in place to make use of the mountains of data they have access to, but the fact is that most other organisations don’t – and won’t for a while yet. There are two core hurdles to widespread adoption of AI: engineering big data management, and engineering AI pipelines. ... AI engineering competency is the next hurdle – and it’s likely to be many years yet before it becomes widespread across industries beyond the tech giants.


Enterprises should be able to sell their excess internet capacity


The idea is that those with excess data capacity, such as a well-provisioned office or data center that may not be using all of its throughput all of the time — such as during the weekend — allocate that spare bandwidth to Dove’s network. Passing data users, such as Internet of Things sensors or individuals going about their business, would then grab the data they need; payment is then handled seamlessly through blockchain smart contracts. “The Dove application will find the closest Dove-powered hotspot or peer node, negotiate the package deal, and connect automatically,” the company says in a white paper. Dove Network says it intends to supply vendors with a blockchain-based wireless router with a range of more than 500 yards. It’s also talking about longer-range access points in the future. Both solutions will allow relatively few organizations to sign up, yet still blanket urban areas with hotspots, it says. Dove Network further says on its website that it believes internet infrastructure is broken. It reckons half of the world is not connected to the internet, yet 35 percent of paid-for data is never used.
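As a rough illustration of the flow the white paper describes (find the closest node, negotiate, connect), here is a minimal Python sketch; the Hotspot fields, the choose_hotspot function and all values are hypothetical and not taken from Dove's actual software.

```python
# Hypothetical sketch of the discover/negotiate/connect flow described
# in the white paper; none of these names come from Dove's real API.
from dataclasses import dataclass

@dataclass
class Hotspot:
    node_id: str
    distance_m: float
    price_per_mb: float   # settlement would happen via a smart contract

def choose_hotspot(hotspots: list[Hotspot]) -> Hotspot:
    """Pick the closest node, breaking ties on price, as the app is said to do."""
    return min(hotspots, key=lambda h: (h.distance_m, h.price_per_mb))

nearby = [Hotspot("office-7f", 120.0, 0.002),
          Hotspot("cafe-a1", 450.0, 0.001)]
node = choose_hotspot(nearby)
print(f"connecting to {node.node_id} at {node.price_per_mb}/MB")
```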


Can SNMP (Still) Be Used to Detect DDoS Attacks?


Polling from the cloud every five seconds might not be the way one wants to build attack detection. And even if one does, it is limited to detecting attacks whose smallest burst lasts at least 10 seconds. What to do when the burst is six seconds, or less? The SNMP polling method simply does not scale for the detection of burst attacks, and we need to move away from pull-based analytics to real-time, event-based methods. On-box RMON rules with threshold detection, generating SNMP traps, provide one alternative without introducing new technologies or protocols. However, what is possible in terms of detections and triggers for SNMP traps will depend on the capabilities of your device. That said, most network equipment manufacturers provide performance management and streaming analytics that far exceed the possibilities of SNMP. Now would be a good time to look at those alternatives and implement an on- or off-box automation for attack detection that triggers traffic redirection through API calls to the cloud service.
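To make the blind spot concrete, here is a minimal polling sketch using the pysnmp library (assuming the classic synchronous pysnmp 4.x hlapi, SNMPv2c with a "public" community string, and a placeholder device address and interface index). A burst shorter than the five-second window gets averaged across it and may never trip the threshold.

```python
# Minimal SNMP counter-polling sketch (pysnmp 4.x synchronous hlapi
# assumed); the device address, community and threshold are placeholders.
import time
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

DEVICE = ("192.0.2.1", 161)      # hypothetical router
IF_INDEX = 1                     # interface to watch
THRESHOLD_BPS = 500_000_000      # example alert threshold: 500 Mbit/s

def poll_octets():
    """Read the 64-bit inbound octet counter for one interface."""
    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData("public"),
        UdpTransportTarget(DEVICE),
        ContextData(),
        ObjectType(ObjectIdentity("IF-MIB", "ifHCInOctets", IF_INDEX))))
    if error_indication or error_status:
        raise RuntimeError(error_indication or error_status.prettyPrint())
    return int(var_binds[0][1])

last = poll_octets()
while True:
    time.sleep(5)                      # the five-second poll interval
    current = poll_octets()
    bps = (current - last) * 8 / 5     # average rate over the window
    last = current
    # A three-second burst is diluted across the window and may never
    # cross the threshold: the blind spot described above.
    if bps > THRESHOLD_BPS:
        print(f"possible attack: {bps / 1e6:.0f} Mbit/s inbound")
```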


Hairy artificial skin gives robots a sense of touch


The smart skin includes nanowire sensors made from zinc oxide (ZnO). They are much thinner than human hair (0.2 microns, while hair is around 40 microns), and when they brush against something, they can sense temperature changes and surface variations. These nanowires are covered in a protective coating that makes them resistant to chemicals, extreme temperatures, moisture, and shock, so they can be used in harsh environments. The nanowires and protective coating are bundled together into one sheet of pressure-sensing "skin" that can be draped over a robot, so existing robots such as a fleet of industrial arms at a manufacturing plant could be retrofitted with a new sense of touch. While the image of hairy robots is endearing, the skin actually just looks like a sheet of plastic with patches of sensors. The "hairs" are so small that you can't feel them, and they can only be seen under a microscope. The researchers describe their smart skin in a paper published in IEEE Sensors Journal in 2015, and they have now received a patent for their technology. We asked lead researcher Zeynep Çelik-Butler how this stands out from other smart skin technologies.


Data veracity challenge puts spotlight on trust

This data veracity challenge is one that most businesses have yet to come to grips with. In our Technology Vision for Oracle 2018, 79 percent of the business executives we spoke with agreed that organizations are basing their most critical systems and strategies on data – yet many have not invested in the capabilities to verify the truth within it. If we’re to harness data for the full benefit of businesses and society, then this challenge needs to be addressed head on. In the past year Oracle unveiled its Autonomous Database, which further maintains data purity by – as the name implies – offering total automation and thereby vastly reducing human error. Steps like these are critical, as data services and websites rely on DaaS to properly analyze their data and provide holistic views of customers. To address the data veracity challenge, businesses should focus on three tenets to build confidence: 1) provenance, or verifying the history of data from its origin throughout its life cycle; 2) context, or considering the circumstances around its use; and 3) integrity, or securing and maintaining data.
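As a sketch of the first tenet, provenance, one could chain hashed lineage entries so that any later tampering with a record's history is detectable. The field names below are illustrative only and are not drawn from any Oracle product.

```python
# Minimal sketch of tamper-evident data provenance: each lineage step
# is hashed together with the previous step's hash, forming a chain.
import hashlib, json, time

def provenance_entry(record: dict, source: str, prev_hash: str) -> dict:
    """Append one step to a record's lineage, chained by hash."""
    body = {"record": record, "source": source,
            "timestamp": time.time(), "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

genesis = provenance_entry({"customer": 42, "email": "a@example.com"},
                           source="crm-export", prev_hash="")
updated = provenance_entry({"customer": 42, "email": "b@example.com"},
                           source="web-form", prev_hash=genesis["hash"])
# Verifying integrity means recomputing each hash along the chain and
# checking it still matches the stored value.
```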


Numerous OpenEMR Security Flaws Found; Most Patched


The OpenEMR community "is very thankful to Project Insecurity for their report, which led to an improvement in OpenEMR's security," Brady Miller, OpenEMR project administrator, tells ISMG. "Responsible security vulnerability reporting is an invaluable asset for OpenEMR and all open source projects. The OpenEMR community takes security seriously and considered this vulnerability report a high priority since one of the reported vulnerabilities did not require authentication," Miller says. "A patch was promptly released and announced to the community. Additionally, all downstream packages and cloud offerings were patched." So, what's been fixed? "The key vulnerability in this report is the patient portal authentication bypass, which essentially allows a bad actor to bypass authentication and gain access to OpenEMR - if the patient portal is turned on," Miller says. "All the other vulnerabilities require authentication." The patient portal authentication bypass, multiple instances of SQL injection, unrestricted file upload, remote code execution and arbitrary file actions vulnerabilities "were all fixed," he says.
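For readers unfamiliar with the SQL injection class of flaw mentioned above, here is a minimal illustration. OpenEMR itself is written in PHP, but the principle is language-independent, so this sketch uses Python's built-in sqlite3 module; the table and values are invented.

```python
# SQL injection illustrated with sqlite3; the fix, parameterized
# queries, is the same idea in any language, including OpenEMR's PHP.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER, name TEXT)")
conn.execute("INSERT INTO patients VALUES (1, 'Alice')")

user_input = "1 OR 1=1"   # attacker-controlled value

# Vulnerable: the input is spliced into the SQL text itself, so the
# attacker rewrites the WHERE clause and matches every row.
rows = conn.execute(
    f"SELECT * FROM patients WHERE id = {user_input}").fetchall()
print("injected query returned", len(rows), "rows")

# Fixed: the driver binds the value, so it can never alter the query.
rows = conn.execute(
    "SELECT * FROM patients WHERE id = ?", (user_input,)).fetchall()
print("parameterized query returned", len(rows), "rows")
```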


What can the enterprise learn from the connected home?


The main driver for enterprise IoT is that the large volumes of data created by connected devices present a huge opportunity. By leveraging the power of analytics – either on a small scale or across large deployments – businesses can gain additional layers of insight into their operations and make improvements. This is exactly what the smart home enables. By using connected products to track energy usage, for example, consumers can learn where they are spending the most money and become more cost-efficient. However, from an enterprise perspective, the challenge comes in being able to efficiently manage and control hundreds or potentially thousands of smart devices. Simply keeping track of the vast swathes of data being generated by devices in a range of different locations and from an assortment of vendors is already a serious issue, and it is likely to be the biggest IoT challenge IT departments will face in the future. What they don’t want is to have several platforms pulling in different data streams. Not only would this be hugely confusing to manage, but the lack of coordination would also create a fragmented picture of what is going on across the business.
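As a sketch of what "one platform pulling in every data stream" might look like in practice, the snippet below uses MQTT, a common IoT transport, to funnel readings from many vendors' devices into a single normalized stream. It assumes the paho-mqtt 1.x client, a reachable broker, and hypothetical topic and payload shapes.

```python
# Minimal sketch of funneling many vendors' devices into one stream
# via MQTT (paho-mqtt 1.x assumed; broker, topics and payload fields
# are hypothetical).
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"

def on_message(client, userdata, msg):
    """Normalize every vendor's payload into one common shape."""
    reading = json.loads(msg.payload)
    unified = {"topic": msg.topic,
               "device": reading.get("id", "unknown"),
               "value": reading.get("value")}
    print(unified)   # in practice: forward to a single analytics pipeline

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883)
# One subscription tree instead of one platform per vendor.
client.subscribe("sensors/#")
client.loop_forever()
```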


How API-based integration dissolves SaaS connectivity limits


API integration supports multichannel experiences that improve customer engagement. One example is how integration helps businesses partner with other service providers to offer new capabilities, such as an API model that makes Uber services available in a United Airlines application. APIs also spur revenue growth. For instance, a business's IP [intellectual property] that lies behind firewalls can be exposed as an API to create new revenue channels. Many new-age companies, such as Airbnb and Lyft, leverage the API model to deliver revenue. Traditional companies [in] manufacturing and other [industries] are really applying this to their domain. API-first design provides modernized back-end interfaces that speed integrations. Doing back-end integrations? You can run the APIs within the data center to integrate SaaS and on-premises applications. A good, well-designed API can actually reduce the cost of integration by 50%.
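Here is a minimal sketch of the "expose internal IP as an API" idea, using Flask; the pricing function and route are invented for illustration and are not any vendor's actual interface.

```python
# Minimal sketch: wrap an internal capability in an HTTP API so
# partners can integrate against it. Route and logic are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

def internal_quote(distance_km: float) -> float:
    """Stand-in for proprietary logic that used to sit behind the firewall."""
    return round(2.50 + 1.20 * distance_km, 2)

@app.route("/v1/quote")
def quote():
    distance = float(request.args.get("distance_km", 0))
    return jsonify({"currency": "USD", "price": internal_quote(distance)})

if __name__ == "__main__":
    app.run(port=8080)   # e.g. GET /v1/quote?distance_km=12.5
```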


Serverless Still Requires Infrastructure Management


Even though the servers are gone from the serverless picture, this doesn’t mean you can forget about infrastructure configuration altogether. Rather than configuring compute instances and many network-related resources, which was commonplace for the traditional IaaS stack, we now need to configure functions, storage buckets and/or tables, APIs, messaging queues/topics and many additional resources to keep everything secured and monitored. When it comes to infrastructure management, serverless architectures usually require more resources to be managed due to the fine-grained nature of serverless stacks. At the same time, without servers in sight, infrastructure configuration can be done as a single-stage activity, in contrast with the need to manage IaaS infrastructure separately from the software artifacts running on different kinds of servers. Even with this somewhat simplified way of managing infrastructure resources, one still needs to use specialised tools for defining and applying infrastructure stack configurations. Cloud platform providers offer their proprietary solutions in this area.
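To illustrate how fine-grained the resource list gets, here is a rough boto3 sketch of provisioning just three pieces of a tiny serverless app on AWS. The names, role ARN and code archive are placeholders, and real stacks would normally be defined with an infrastructure-as-code tool rather than ad hoc API calls.

```python
# Rough sketch of how many resources even a tiny serverless app
# touches, using boto3 against AWS; all identifiers are placeholders.
import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")
lam = boto3.client("lambda")

s3.create_bucket(Bucket="example-uploads-bucket")      # storage
sqs.create_queue(QueueName="example-work-queue")       # messaging
lam.create_function(                                   # compute
    FunctionName="example-handler",
    Runtime="python3.12",
    Role="arn:aws:iam::123456789012:role/example-role",
    Handler="app.handler",
    Code={"ZipFile": open("app.zip", "rb").read()})
# Still to configure: API gateway, permissions, alarms, log retention...
```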


5 ways machine learning makes life harder for cybersecurity pros

Machine learning is a form of AI that interprets massive amounts of data, applying algorithms to the material and making predictions based on its observations. Common technologies that employ machine learning include facial recognition, speech recognition, translation services, and object recognition. Businesses typically use machine learning for locating and processing large data sets that no human could sort through in a timely manner, if at all. Major companies like Amazon, IBM, Google, and Microsoft use machine learning to improve business functionality. But some organizations are implementing machine learning for a narrower purpose: cybersecurity. While many assume machine learning makes cybersecurity professionals' lives much easier by better tracking security issues, that's not necessarily the case. Just like any new technology, machine learning still has its flaws—problems that turn the tech into more of a headache than a helping hand in the security space.
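As a toy example of what ML-based security tooling does under the hood, the sketch below flags an anomalous network flow with scikit-learn's IsolationForest; the feature values are fabricated for illustration.

```python
# Toy anomaly detection on network-style features (bytes sent,
# connection count) using scikit-learn's IsolationForest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# 200 synthetic "normal" flows clustered around 500 bytes / 10 conns.
normal = rng.normal(loc=[500, 10], scale=[50, 2], size=(200, 2))
attack = np.array([[5000, 90]])   # one obviously hostile flow

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal)

print(model.predict(attack))      # -1 means flagged as anomalous
print(model.predict(normal[:3]))  # mostly 1, i.e. benign
```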



Quote for the day:


"Making those around you feel invisible is the opposite of leadership." -- Margaret Heffernan



