The data lake may be a fashionable concept, but it is also one born of necessity: the traditional data warehouse approach cannot handle the vast quantities of data now being produced. So, rather than transforming all data before loading it (as a data warehouse does), the principle of the data lake is to reverse these two steps.
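To make the reversal concrete, here is a minimal Python sketch of the load-first idea, using SQLite and a hypothetical raw_events table purely for illustration:

```python
import json
import sqlite3

# Data lake principle in miniature: load raw records as-is first,
# and transform them only when they are read.
conn = sqlite3.connect("lake.db")
conn.execute("CREATE TABLE IF NOT EXISTS raw_events (payload TEXT)")

# 1. Load: store incoming records untouched, whatever their shape.
incoming = ['{"user": "a", "clicks": 3}', '{"user": "b", "temp_c": 21}']
conn.executemany("INSERT INTO raw_events VALUES (?)", [(r,) for r in incoming])
conn.commit()

# 2. Transform on demand: apply structure only when a question is asked.
clicks = [
    json.loads(payload)["clicks"]
    for (payload,) in conn.execute("SELECT payload FROM raw_events")
    if "clicks" in json.loads(payload)
]
print(sum(clicks))  # -> 3
```

The point of the sketch is that no schema is imposed at load time; structure is applied only at query time, when someone has a question to answer.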
In the beginning, applications ran on one operating system on one physical server, period. Then came hypervisors and virtual machines. A VM offers a full operating system and associated resources, and several VMs can run on the same physical machine. However, VMs must still be started or “spun up” on a server. They can consume […]
The ideas behind hybridization and hyper-scaling in the cloud are simple. Hybridization is mixing public and private cloud computing. Hyper-scaling is the ability to grow or shrink resources to meet your needs. So far, so good, although things can get more complicated when it comes to putting these notions into practice. The trick, as with any information technology, is to ensure that the benefits outweigh the expenses.
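As a rough illustration of hyper-scaling, here is a minimal Python sketch of the target-tracking logic an autoscaler might apply; the utilization target, fleet sizes, and limits are assumptions for illustration only:

```python
import math

# Hyper-scaling in miniature: grow or shrink capacity to track demand.
# (Hypothetical thresholds and fleet sizes, for illustration only.)
def desired_instances(current: int, cpu_utilization: float,
                      target: float = 0.6,
                      min_n: int = 1, max_n: int = 20) -> int:
    """Pick a fleet size that brings average CPU utilization toward the target."""
    wanted = math.ceil(current * cpu_utilization / target)
    return max(min_n, min(max_n, wanted))

print(desired_instances(current=4, cpu_utilization=0.9))  # 6 -> scale out
print(desired_instances(current=4, cpu_utilization=0.2))  # 2 -> scale in
```

Whether run against a public or a private cloud, the decision logic is the same; the cost question is whether the extra capacity you spin up earns more than it costs.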
Big data goes beyond precisely defined data sets. Instead of purely employee-entered data, for example, big data extends to customer or constituent comments on social media, third-party economic forecasts, machine sensor outputs and any other data source that has a bearing on the questions being asked or the issues to be resolved. Analyzing big data to reveal patterns, trends and likely outcomes can take organizations far beyond the limits of “normal” data processing to improve decision-making and efficiency.
“Do more with less – and do it better, too!” Federal CIOs are all too familiar with this edict. IT efficiency and cost reduction are top priorities these days, and CIOs must figure out how to achieve them. However, IT efficiency isn’t just about getting better productivity from resources. It’s also about making sure those improved returns contribute to the organization’s mission success. Controls for IT efficiency and cost reduction must be designed to work together from the outset.
The need for proper IT asset management has steadily increased, and with it the need for technologies like the configuration management database (CMDB). Fusion PPT takes a closer look at CMDB tools and strategies for choosing the one that best meets your company’s needs.
Servers and applications can generate a wealth of performance feedback and log data. This data can be used at different stages of a DevOps release cycle, helping to pinpoint the origins of issues and highlight opportunities for improvement. With the right tool for collecting data from different locations, collating it, and making sense of […]
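As a rough illustration of the collect-and-collate step, here is a small Python sketch that scans a directory of log files and counts errors per component; the directory, file naming, and log format are hypothetical:

```python
import re
from collections import Counter
from pathlib import Path

# Collate logs from several files and surface the noisiest error sources.
# (Path and log-line format are assumptions, purely for illustration.)
LOG_LINE = re.compile(r"(?P<level>ERROR|WARN|INFO)\s+(?P<component>\S+)")

def error_hotspots(log_dir: str) -> Counter:
    counts = Counter()
    for log_file in Path(log_dir).glob("*.log"):
        for line in log_file.read_text().splitlines():
            match = LOG_LINE.search(line)
            if match and match.group("level") == "ERROR":
                counts[match.group("component")] += 1
    return counts

# During a release, a spike for one component points at the origin of an issue.
print(error_hotspots("/var/log/myapp").most_common(3))
```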
In DevOps, the goal is not only to excel in producing applications. It is also to control all of the technology infrastructure through code, including integration testing, deployment server configuration, monitoring, and reporting. Once the code is in place for any of these items, it can be automatically triggered and executed. This code-based approach is […]
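Here is a minimal, idempotent sketch of that code-based approach in Python; the package list and the Debian-style dpkg/apt-get commands are assumptions for illustration:

```python
import subprocess

# Infrastructure as code in miniature: the desired state lives in
# version-controlled data, and a small runner converges the machine toward it.
# (Package names and the apt-based install step are illustrative assumptions.)
DESIRED_PACKAGES = ["nginx", "git"]

def is_installed(pkg: str) -> bool:
    # dpkg -s exits non-zero when the package is absent.
    return subprocess.run(["dpkg", "-s", pkg],
                          capture_output=True).returncode == 0

def converge() -> None:
    for pkg in DESIRED_PACKAGES:
        if not is_installed(pkg):
            subprocess.run(["apt-get", "install", "-y", pkg], check=True)
            print(f"installed {pkg}")
        else:
            print(f"{pkg} already present; nothing to do")

if __name__ == "__main__":
    converge()  # safe to re-run: only the missing pieces change
```

Because the script converges toward a declared state rather than blindly repeating actions, it can be triggered automatically on every release without side effects.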
The closer a team works together on producing and deploying a software application, the better the chances of a timely, high-quality result. DevOps is the way to achieve this, through tight collaboration between developers and operations staff, and automation of the different release steps. To move ahead in creating code, developers also need […]
Software ready for the real world still has a virtual last mile to go. After development, testing, continuous integration, and quality assurance, for example, it must be installed on a production server – or perhaps hundreds or even thousands of production servers. When these servers are largely identical, automation of the deployment and subsequent maintenance […]
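As a rough sketch of what that fan-out automation can look like, here is a minimal Python example that pushes a build to several identical hosts in parallel; the host names, paths, and rsync/ssh commands are hypothetical, and a real pipeline would add health checks and rollback:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Deploy the same build to a fleet of largely identical servers in parallel.
# (Host names and remote paths are assumptions, for illustration only.)
HOSTS = ["web01.example.com", "web02.example.com", "web03.example.com"]

def deploy(host: str) -> str:
    # Copy the build artifacts, then restart the service on the remote host.
    subprocess.run(["rsync", "-az", "build/", f"{host}:/srv/app/"], check=True)
    subprocess.run(["ssh", host, "systemctl restart app"], check=True)
    return f"{host}: deployed"

with ThreadPoolExecutor(max_workers=10) as pool:
    for result in pool.map(deploy, HOSTS):
        print(result)
```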