
Daily Tech Digest - July 27, 2018

Mastering Spring framework 5, Part 1: Spring MVC

Spring MVC is the Spring framework's traditional library for building Java web applications. It is one of the most popular web frameworks for building fully functional Java web applications and RESTful web services. In this tutorial, you'll get an overview of Spring MVC and learn how to build Java web applications using Spring Boot, Spring Initializr, and Thymeleaf. We'll fast-track our Spring MVC web application with the help of Spring Boot and Spring Initializr. Given input for the type of application to be built, Spring Initializr uses the most common dependencies and defaults to set up and configure a basic Spring Boot application. You can also add custom dependencies, and Spring Initializr will include and manage them, ensuring version compatibility with both third-party software and Spring. Spring Boot applications run standalone, without requiring you to provide a runtime environment. In this case, since we're building a web application, Spring Boot will automatically include and configure Tomcat as part of the app's runtime. We can also customize the app by adding an H2 database driver to our Maven POM file.
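As a sketch, the POM customization described above might look like the following dependencies section (these are the standard Spring Boot starter artifact names; versions are inherited from the Spring Boot parent POM, so none are pinned here):

```xml
<!-- Illustrative POM fragment: web starter pulls in Spring MVC and embedded
     Tomcat; Thymeleaf starter enables server-side templates; H2 provides an
     in-memory database driver at runtime. -->
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-thymeleaf</artifactId>
    </dependency>
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
        <scope>runtime</scope>
    </dependency>
</dependencies>
```

Because the starters manage transitive versions, adding H2 this way avoids version conflicts between the driver and Spring's data-access libraries.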



5 Keys to Creating a Data Driven Culture

With businesses restructuring their entire models to accommodate the need for digital change, it is worth asking what is driving this disruption. The need for digital change starts with data. Data has become the need of the hour, and to manage and extract value from it, organizations need to go where the customers are: digital. Customers generate data in the digital world, and organizations are embracing this digital change in a bid to capture that data. IoT devices and smartphones play an important role in data generation, curating data important to all organizations. But customers are not the only ones generating this data. From smart city technologies such as connected cars, trains, and video surveillance, to businesses themselves, data is generated at a meteoric rate. The digital interactions every business has with its customers are one of the major sources of data, and businesses often ponder how they could use these sources to reach meaningful insights that help them in real time.


New NetSpectre Attack Can Steal CPU Secrets via Network Connections

Although the attack is innovative, NetSpectre also has its downsides (or upsides, depending on which side of the academics/users divide you are on). The biggest is the attack's woefully slow exfiltration speed, which is 15 bits/hour for attacks carried out via a network connection and targeting data stored in the CPU's cache. Academics achieved higher exfiltration speeds (of up to 60 bits/hour) with a variation of NetSpectre that targeted data processed via a CPU's AVX2 module, specific to Intel CPUs. Nonetheless, both NetSpectre variations are too slow to be considered valuable to an attacker. This makes NetSpectre a theoretical threat for now, not something that users and companies should be planning for with immediate urgency. But as we've seen in the past with Rowhammer attacks, as academics spend more time probing a topic, exfiltration speeds will eventually go up, while the technical limitations that prevent such attacks from working will slowly dissipate.
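To put those rates in perspective, a quick back-of-the-envelope calculation shows why the attack is impractical today (the key sizes are illustrative examples, not taken from the article):

```python
# How long would NetSpectre take to leak a cryptographic key at the
# reported rates (15 bits/hour via cache, 60 bits/hour via AVX2)?
def hours_to_leak(secret_bits, rate_bits_per_hour):
    """Hours needed to exfiltrate a secret at a given leak rate."""
    return secret_bits / rate_bits_per_hour

aes_key_bits = 128
rsa_key_bits = 2048

print(hours_to_leak(aes_key_bits, 15))        # ~8.5 hours for an AES-128 key
print(hours_to_leak(rsa_key_bits, 15) / 24)   # ~5.7 days for a 2048-bit RSA key
print(hours_to_leak(rsa_key_bits, 60) / 24)   # ~1.4 days even via AVX2
```

Hours to days of sustained, undetected measurement against a single target is a poor trade-off compared with conventional attack techniques, which is why the authors' rates matter more than the novelty of the channel.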


Embracing RPA - Opportunities and Challenges for Accountancy Profession

Once IT and security teams are satisfied with the IT architecture, the process is documented in detail and can be carried forward for implementation. Key sectors where RPA is playing a significant role in bringing in process efficiencies include highly regulated verticals such as healthcare, banking, financial services, and insurance. Other major sectors include telecommunications, utilities, mining, travel, and retail. ... Business users in the organisation review the work of the robots, resolve any exceptions, and escalate, if required, to the identified stakeholder for resolution. In the long run, the bots can become self-learning, taking RPA to the level of decision making. RPA is expected to revolutionize and redefine the way we work, making processes smarter and quicker. RPA deployments have commenced in most large businesses, will continue to grow, and are expected to become cognitive within the next five years. Further, many predict that RPA will develop into a machine learning platform, probably by 2025-2026.


Containers Provide the Key to Simpler, Scalable, More Reliable App Development


Kubernetes originally came out of Google, and it’s basically an orchestration layer around containers. For example, if I’m writing a containerized application, I can run it on top of Kubernetes, and Kubernetes will handle a lot of the underlying infrastructure orchestration—specifically, things like scaling up to meet demand or scaling down when demand is light. If servers crash, it will spin up more. The application developer simply says, “Hey, here are my containers. This is what they look like. Run them,” and then Kubernetes manages and orchestrates all of the underlying capacity. Kubernetes works whether you’re developing an application for three people or a global enterprise. What you’re doing is applying good architectural structure around a large-scale application whether you need it or not. So, you’re getting inherent reliability and scaling abilities along with capabilities to address and handle failures. For example, let's say I deploy a cluster within an on-prem or cloud infrastructure region and it is spread across three different physical availability domains.
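The declarative model described here ("here are my containers, run them") can be sketched as a Deployment manifest. This is a minimal illustrative example; the application name, image, and replica count are placeholders, not taken from the article:

```yaml
# The developer declares the desired state; Kubernetes maintains it,
# rescheduling pods onto healthy nodes if a server crashes and scaling
# the replica count up or down as demand changes.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app                  # hypothetical application name
spec:
  replicas: 3                     # desired number of running copies
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
      - name: demo-app
        image: example.com/demo-app:1.0   # placeholder image reference
        ports:
        - containerPort: 8080
```

The key design point is that the manifest states *what* should be running, not *how* to get there; the orchestration layer continuously reconciles actual cluster state toward this declaration.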


CCTV and the GDPR – an overview for small businesses

The GDPR requires data controllers and processors to implement “appropriate technical and organisational measures” to protect personal data. This entails an approach based on regular assessments to ensure that all risks are appropriately addressed. For instance, access to CCTV systems must be limited to authorised personnel, which is especially important where systems are connected to the Internet or footage is stored in the Cloud, and there is a greater risk of unauthorised access. Surveillance systems should also incorporate privacy-by-design features, including the ability to be switched on or off, and the option to switch off image or sound recordings independently where it would be excessive to capture both. CCTV equipment must also be of a sufficient quality and standard to achieve its stated purpose. The international standard for information security management, ISO 27001, is an excellent starting point for implementing the technical and organisational measures necessary under the GDPR.


Why a product team’s experimentation goes wrong

The only thing worse than not running experiments is running experiments that are misinterpreted. There are several ways in which companies misunderstand the statistics behind experiments. Firstly, companies are overly reactive to early returns. Early on during experiments there are few conversions and experiment results swing wildly. When teams "peek early" at results, they frequently overvalue the data and end experiments prematurely. It is very common for the direction of a metric to swing over the course of an experiment, and teams that do not have the patience to wait are at the mercy of random chance. Secondly, stakeholders often create arbitrary pressures and deadlines to get answers early. In many business processes, management can improve productivity by introducing pressure and deadlines for teams. However, in the realm of science, this behaviour causes the opposite of the intended effect. Ordering teams to deliver results by a certain date often causes them to interpret insignificant data through gut feel. While these decisions can feel scientific to executives, they are all too often incorrect and give a false certainty about the wrong direction.
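The "peeking" problem can be demonstrated with a small simulation (all parameters here are illustrative): repeatedly run an A/A test, where both variants are identical, and compare the false-positive rate of a team that stops at the first significant peek against one that waits for the fixed horizon.

```python
# Simulate A/A tests (no real difference) to show how peeking at interim
# results inflates the false-positive rate above the nominal 5%.
import random, math
random.seed(42)

def z_stat(conv_a, n_a, conv_b, n_b):
    """Two-proportion z statistic for conversion counts."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return 0.0 if se == 0 else (conv_a / n_a - conv_b / n_b) / se

def run_aa_experiment(n, rate, peeks):
    """One A/A test; returns (significant when peeking, significant at horizon)."""
    a = [random.random() < rate for _ in range(n)]
    b = [random.random() < rate for _ in range(n)]
    peeked_significant = False
    for k in peeks:
        if abs(z_stat(sum(a[:k]), k, sum(b[:k]), k)) > 1.96:
            peeked_significant = True   # team "peeks early" and declares a winner
            break
    fixed_significant = abs(z_stat(sum(a), n, sum(b), n)) > 1.96
    return peeked_significant, fixed_significant

trials, n, rate = 400, 2000, 0.05
peeks = list(range(200, n + 1, 200))   # check results every 200 visitors
peek_fp = fixed_fp = 0
for _ in range(trials):
    p, f = run_aa_experiment(n, rate, peeks)
    peek_fp += p
    fixed_fp += f
print(f"false positives with peeking:    {peek_fp / trials:.2%}")
print(f"false positives at fixed horizon: {fixed_fp / trials:.2%}")
```

Because every interim check is another chance for random noise to cross the significance threshold, the peeking team "finds" nonexistent effects several times more often than the team that waits.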


With Today’s Technology, Do You Really Need A Financial Advisor?

Finding an asset to invest in is one thing, but understanding how to implement it is another. Before investing in any funds, it is very important to study the historical data of the asset class. Sure, past performance doesn't necessarily correlate with future performance, but it is reasonable to think that some historical risk-reward relationships are likely to persist (i.e., long-term, stocks could be expected to outperform bonds, but with a higher degree of volatility). A financial advisor will look at all of these factors and present you with an implementation plan that is likely to benefit you the most. While choosing assets to invest in, another aspect that clients usually overlook is taxes. If the future returns on an asset turn out to be average while the taxes on them are high, then the overall return for an investor will be negatively affected. This is why tax management is important, as tax-conscious financial planning and tax-efficient portfolio construction can lead to higher returns.
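The effect of tax drag compounds over time. A toy calculation makes the point; the return and tax rates below are made up purely for illustration:

```python
# Compare two portfolios with identical pre-tax returns but different
# effective tax rates on each year's realized gains.
def after_tax_growth(principal, annual_return, tax_rate, years):
    """Grow principal for `years`, taxing each year's gain at `tax_rate`."""
    value = principal
    for _ in range(years):
        gain = value * annual_return
        value += gain * (1 - tax_rate)
    return value

p = 10_000
high_tax = after_tax_growth(p, 0.07, 0.35, 20)       # gains taxed yearly at 35%
efficient = after_tax_growth(p, 0.07, 0.15, 20)      # tax-efficient: 15%
print(f"high-tax portfolio:      {high_tax:,.0f}")
print(f"tax-efficient portfolio: {efficient:,.0f}")
```

Even though both portfolios earn the same 7% before tax, the lower tax drag compounds into a meaningfully larger balance after twenty years, which is exactly the kind of structural difference tax-conscious planning targets.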


This company changes the DNA of investing — through machine learning

Simply put, a computer can be taught what 'successful trading' looks like, and combine such information from various users to build an investment portfolio that draws from their cumulative wisdom. It is no wonder, then, that financial giants such as JPMorgan Chase and Goldman Sachs are openly utilizing machine learning for their investing practices. After all, they have the resources and the data to make it work. However, this power is not reserved for these giant corporations. There are instances in which machine learning can benefit the 'little guy' as well. eToro's declared mission is to disrupt the traditional financial industry and break down the barriers between private investors and professional-level practices. One such instance can be seen in eToro's CopyFunds Investment Strategies, which are managed thematic portfolios, powered by advanced machine learning algorithms. This means private individuals now have access to technology previously reserved for giant corporations.
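As an illustration only (this is not eToro's actual algorithm, and the traders, scores, and tickers are invented), combining several traders' holdings weighted by a historical performance score might be sketched as:

```python
# Toy "cumulative wisdom" portfolio: weight each trader's holdings by a
# performance score, then normalize so the blended weights sum to 1.
def blended_portfolio(traders):
    """traders: list of (score, {asset: weight}) with each weight dict summing to 1."""
    total_score = sum(score for score, _ in traders)
    blended = {}
    for score, holdings in traders:
        for asset, w in holdings.items():
            blended[asset] = blended.get(asset, 0.0) + w * score / total_score
    return blended

traders = [
    (0.9, {"AAPL": 0.5, "MSFT": 0.5}),   # strong historical performer
    (0.3, {"AAPL": 0.2, "TSLA": 0.8}),   # weaker performer
]
print(blended_portfolio(traders))
```

A real system would learn the scores themselves (from risk-adjusted returns, drawdowns, consistency, and so on) rather than taking them as given; the sketch only shows the combination step.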


The Commercial HPC Storage Checklist – Item 2 – Start Small, Scale Large

As the HPC project moves into full-scale production, the organization then faces the opposite problem: making sure the system can scale large enough to continue to meet the capacity demands of the project. Scaling out requires meeting several challenges. First, the system has to integrate new nodes into the cluster successfully, since additional nodes provide the needed capacity and performance. However, adding another node is not always as straightforward as it should be. Many systems require adding the node manually as well as manually rebalancing data from other nodes to the new node. The commercial HPC storage customer should look for an HPC storage system that can grow with them as their needs evolve. It should start small during the initial phases of development but scale large as the environment moves into production. The system should make the process of adding nodes as simple as possible: automatically discovering available nodes, adding them to the cluster, and rebalancing cluster data without impacting storage performance.
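As a toy model of why automatic rebalancing is tractable: when a new node joins an evenly balanced cluster, an ideal rebalance moves only 1/(n+1) of the existing data, so the disruption per expansion shrinks as the cluster grows (the node counts below are illustrative):

```python
# Fraction of stored data that must relocate when one node joins an
# evenly balanced cluster of `current_nodes` nodes: each node should end
# up holding 1/(n+1) of the data, so exactly that fraction moves.
def fraction_moved_on_add(current_nodes):
    return 1 / (current_nodes + 1)

for n in (3, 7, 15):
    print(f"{n} -> {n + 1} nodes: move {fraction_moved_on_add(n):.1%} of data")
```

This is why schemes such as consistent hashing are popular in scale-out storage: they keep the amount of data touched per expansion close to this minimum, which is what allows rebalancing to proceed without impacting storage performance.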



Quote for the day:


"Ever tried. Ever failed. No matter. Try again. Fail again. Fail better." -- Samuel Beckett




This post first appeared on Tech Bytes - Daily Digest, please read the original post: here
