Welcome to the airport of the future, where your face is your passport
Biometrics aren't just being used at border control. Sydney Airport has announced it's teaming up with Qantas, Australia's largest airline, to use facial recognition to simplify the departure process. Under a new trial, passengers on select Qantas international flights can have their face and passport scanned at a kiosk when they check in. From then on, they won't need to present their passport to Qantas staff -- they'll be able to simply scan their face at a kiosk when they drop off luggage, enter the lounge and board their flight at the gate. Travellers will still need to go through regular airport security and official immigration processing, but all of their dealings with Qantas can be handled with facial recognition. "Your face will be your passport and your boarding pass at every step of the process," Geoff Culbert, Sydney Airport CEO, said of the new development.
Google just gave control over data center cooling to an AI
Now, Google says, it has effectively handed control to the algorithm, which is managing cooling at several of its data centers all by itself. “It’s the first time that an autonomous industrial control system will be deployed at this scale, to the best of our knowledge,” says Mustafa Suleyman, head of applied AI at DeepMind, the London-based artificial-intelligence company Google acquired in 2014. The project demonstrates the potential for artificial intelligence to manage infrastructure, and shows how advanced AI systems can work in collaboration with humans. Although the algorithm runs independently, a person manages it and can intervene if it seems to be doing something too risky. The algorithm exploits a technique known as reinforcement learning, which learns through trial and error. The same approach led to AlphaGo, the DeepMind program that vanquished human players of the board game Go.
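The trial-and-error idea behind reinforcement learning can be sketched in a few lines. The toy below is not Google's system; it is a minimal, hypothetical "pick a cooling setpoint" bandit with a made-up cost function, showing how repeated feedback lets an agent learn which action minimizes energy cost.

```python
import random

# Toy sketch of reinforcement learning by trial and error.
# ACTIONS and energy_cost() are invented for illustration only.
random.seed(0)

ACTIONS = [16, 18, 20, 22, 24]           # hypothetical setpoints (deg C)

def energy_cost(setpoint):
    # Assumed cost model: cheapest at 20 C, with measurement noise.
    return (setpoint - 20) ** 2 + random.gauss(0, 0.5)

q = {a: 0.0 for a in ACTIONS}            # running cost estimate per action
alpha, epsilon = 0.1, 0.2                # learning rate, exploration rate

for _ in range(2000):
    if random.random() < epsilon:        # explore: try a random setpoint
        a = random.choice(ACTIONS)
    else:                                # exploit: best setpoint so far
        a = min(q, key=q.get)
    cost = energy_cost(a)                # observe the outcome
    q[a] += alpha * (cost - q[a])        # nudge the estimate toward it

best = min(q, key=q.get)                 # learned best setpoint: 20
```

The real system observes thousands of sensor readings and respects safety constraints, which is why, as the article notes, a human stays in the loop to veto risky actions.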
Overlook 5G security at your peril
Attacks can come in many different shapes and sizes: user malware, fraudulent calls, spam, viruses, data and identity theft, and denial of service, to name a few examples. The rise in security threats is partly due to the growing deployment of carrier Wi-Fi access infrastructures and small cells in public areas, offices and homes, and will increase exponentially with M2M. Historically, carrier-grade telecom networks have had an excellent record for user and network security; however, today’s communications infrastructure is far more vulnerable than its predecessors. And with security threats constantly evolving, service providers must invest in the right tools to keep on top of the issue. These increasing security risks stem from the move to the IP-centric LTE architecture. The flatter architecture exposed 4G networks because there are fewer hops between the access network and the core, and this will continue to be an issue with 5G networks.
Companies lack leadership capabilities for digital transformation projects
Yet, even after years of exponential growth in the digital and digital consulting arenas, new Capgemini research shows that the implementation of digital transformation projects is still in its nascent stages. According to the responses of more than 1,300 business leaders from some 750 organisations, only a relatively small number of companies have the digital (39%) and managerial (35%) capabilities needed to make their digital transformation successful. While the fact that these figures remain below 50% is surprising, what is even more striking is that, compared with exactly the same measurement six years ago, there has actually been a decline in firms’ general readiness for digital transformation. Capgemini found that organisations feel less equipped with the right leadership skills than before: 45% felt ready in 2012, compared with 35% in 2018. According to Vincent Fokke, Chief Technology Officer at Capgemini in the Benelux, this is an important point to note.
AI and Robots: Not What You Think
Depending on what you read, and what you choose to believe about what you read, AI-driven robots are either already able to autonomously decide what work gets done, how it gets done and who does it, or decades of work remain before we see a material impact. Personally, I think we’re somewhere in the middle, as manufacturers, pragmatists that they are, design and implement manufacturing strategies in a very deliberate way to achieve business requirements and then focus ongoing efforts on making key processes better and better. And I think that collaborative robots (cobots) will play a larger and larger role in accelerating progress. The AI that cobots possess makes them much more than machines for dirty, dull and dangerous work. So let the world watch and wait for artificial intelligence that will enable wholesale change in how we drive, care for our aged, teach our children and more. Manufacturers don’t have to wait for AI-driven robots to start making their operations better.
Serverless vs. Containers
Debate about serverless vs. containers often starts with control, or the lack thereof in the case of serverless. This is not new. In fact, I clearly remember the same debates around control when AWS was starting to gain traction back in 2009. Now, 10 years later, the dust has settled on that original debate, but we have failed to learn our lesson. It's human nature to want control, but how much are you willing to pay for it? Do you know the total cost of ownership (TCO) you will take on? The ability to control your own infrastructure comes with a lot of responsibilities. To take on those responsibilities, you need the relevant skill sets in your organization. That means salaries (easily the biggest expense in most organizations), agency fees, and time taken away from your engineers and managers for recruitment and onboarding. Given the TCO involved, having that control must serve a goal -- for example, achieving predictable performance for a business-critical workflow -- rather than being an end in itself.
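The TCO point can be made concrete with a back-of-envelope comparison. Every figure below is a hypothetical assumption chosen for illustration, not a real price quote; the point is only that people costs, not just infrastructure, dominate the self-managed column.

```python
# Hypothetical back-of-envelope TCO comparison (all numbers assumed).

# Self-managed containers: instances plus the people to run them.
cluster_cost = 3_000 * 12            # assumed $3k/month of instances
ops_salary = 140_000                 # assumed one platform engineer/year
container_tco = cluster_cost + ops_salary          # 176,000

# Serverless: pay per request, with operations bundled into the price.
requests_per_month = 50_000_000      # assumed workload
cost_per_million = 20.0              # assumed bundled $ per 1M requests
serverless_tco = requests_per_month / 1_000_000 * cost_per_million * 12
```

Under these invented numbers the salary line alone dwarfs the infrastructure line, which is the article's argument: control is worth buying only when it optimizes for something specific.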
The Evolution of Internet of Things: New Business Models Need Interoperability
The predicted rate of connected-device growth often cited by Gartner, Deloitte and others is based upon the proliferation of data, the effect this growth will have on businesses, and the number of new businesses that will be created. But if the current trend of siloed, single-use-case IoT solutions continues, these predictions for connected-device growth may not be realised. Open APIs between product and service providers are the key technology for resolving this issue. ... That is simply too expensive and time-consuming to maintain, particularly for smaller businesses. Simply managing the connection with a single partner can generate maintenance costs that a business with tight margins may find unviable. Gartner has predicted that 75 per cent of IoT projects will take twice as long as planned, because of the increasing complexity associated with developing this connectivity. So what is the solution?
Streamlining Data Science and Analytics Workflows for Maximum ROI
Despite the multitude of tasks associated with the data science position, its basic workflow (in terms of analytics) is readily codified into three steps. The first is data preparation, or data wrangling, where the data scientist starts with raw data and “just tries to make sense of it before they’re doing anything real with it,” Mintz explains. “Then there’s the actual model building, when they’re building a machine learning model. Assuming they find something valuable, there’s getting that insight back into the hands of the people who can use it to make the business run better.” Typically, data scientists approach building a new analytics solution for a specific business problem by accessing raw data from what might be a plethora of sources. Next, they engage in a lengthy process to prepare the data for consumption. “So much time and energy goes into that,” says Mintz. “You look at the surveys of data scientists and they say 70-80% of my time goes to data cleaning.”
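The three steps above can be sketched in miniature. The records, the missing-value mess, and the above-average-spend rule below are all invented stand-ins; the trivial rule takes the place of a real machine learning model.

```python
# Minimal sketch of the three-step analytics workflow (all data made up).

raw = [
    {"customer": "a", "spend": "120"},
    {"customer": "b", "spend": ""},      # missing value: the typical mess
    {"customer": "c", "spend": "300"},
    {"customer": "d", "spend": "90"},
]

# Step 1: data preparation / wrangling -- coerce types, drop bad rows.
clean = [
    {"customer": r["customer"], "spend": float(r["spend"])}
    for r in raw if r["spend"]
]

# Step 2: "model building" -- a trivial rule standing in for a real model:
# flag customers who spend above the mean.
mean_spend = sum(r["spend"] for r in clean) / len(clean)
model = lambda r: r["spend"] > mean_spend

# Step 3: get the insight back to the people who can act on it.
high_value = [r["customer"] for r in clean if model(r)]
```

Even in this toy, step 1 is the bulk of the code, echoing the 70-80% figure from the surveys Mintz cites.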
Stranded and in Need of Rescue: Your Enterprise Data
In today’s enterprise, it is still very common for data to be stored disparately across any number of locations and systems. Getting to a single version of the truth is virtually impossible with siloed data, and different areas of the business act and operate in different ways depending upon which version of the truth they subscribe to or have access to. In fact, in an upcoming report released later this month, IDC names data silos as the number one challenge for Digital Transformation (DX). That is because isolated data leads inexorably to isolated working practices, and those are the antithesis of an integrated strategy, which is what DX is all about. Integration. No wonder, then, that figures such as those produced by Harvard Business Review and Forbes show that nearly two thirds of DX initiatives are failing. Optimal use of information is a critical success factor for today’s enterprise.
Network technologies are changing faster than we can manage them
Data breaches and user experience are the two biggest network worries. About 33 percent of network professionals said a data breach worries them most about their network. Given the almost daily data breaches, who can blame them? In an ideal world, network managers would like tools that combine network and security management. However, only about 40 percent of respondents said their organization uses the same stack of tools to manage both network performance and security. Network pros are also being overwhelmed by the huge proliferation of cloud and network management tools, and many organizations are trying combinations of tools to cope. Network traffic analytics appears to be the most commonly used, with just over 28 percent of network professionals relying on it to manage their network challenges.
Quote for the day:
"If you don’t have some self doubts and fears when you pursue a dream, then you haven’t dreamed big enough." -- Joe Vitale