Saturday, February 15, 2020

Data Warehouse, Data Mart and Business Intelligence Essay

Data Warehouses, Data Marts and Databases

A data warehouse refers to a data storage location used to secure, archive, and analyze data. It comprises many integrated databases within an organization. Data stored in a data warehouse must be easily accessible to facilitate the daily operations of an organization. There are several types of data warehouses. In offline operational data warehouses, data is copied from real-time operational systems and stored offline. Offline data warehouses store integrated data that is frequently updated and can be easily accessed. Real-time data warehouses are updated whenever new data comes in, for example in point-of-sale systems. Integrated data warehouses can be accessed by other systems (Jensen, Pedersen, and Thomsen, 2010).

Data marts are smaller data warehouses covering a specific department or subject. They differ from data warehouses in that they are less complex and are easier to develop and maintain. Data warehouses also cover many subject areas and collect their data from various sources, while data marts deal with one subject and collect data from a few sources. There are dependent and independent data marts. Dependent data marts source their data from a functional central data warehouse, while independent data marts get data from external sources. A data mart can be a small division of a data warehouse (Jensen, Pedersen, and Thomsen, 2010).

Databases contain records of data that can be easily accessed. While databases are designed to record and store data, data warehouses are designed to respond to critical business queries. All data warehouses are databases, but few databases can be considered data warehouses. Databases are usually online transaction processing (OLTP) systems for recording transactions, while data warehouses are online analytical processing (OLAP) systems for querying and analyzing data (Jensen, Pedersen, and Thomsen, 2010).

Data Warehouse Architectures and Tools

Data warehouses are developed in several steps, including data collection, data cleansing and transformation, data aggregation and analysis, and presentation. Data collection involves identifying suitable data for the warehouse and where it can be sourced from. In data cleansing and transformation, the collected data is restructured to make it usable for reporting, querying, and analysis. Data aggregation and analysis involves using query tools to pull data from the central data warehouse and processing it to produce the required results. Presentation involves delivering the end results to users in the form of text, charts, or tables (Barry, 2003). A brief illustrative sketch of these steps appears at the end of this post.

Data warehouse architectures vary from one organization to another depending on their data. These architectures include independent data marts, hub-and-spoke, federated, the centralized data warehouse, and the data mart bus architecture with linked dimensional data marts. The independent data marts architecture involves developing autonomous marts with different data definitions, measures, and dimensions. The data mart bus architecture with linked dimensional data marts is designed to meet the needs of a specific business process. It involves the development of one
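To make the four development steps above concrete, here is a minimal sketch in Python. It is purely illustrative: the sample records, field names, and in-memory processing are assumptions made for the example and do not correspond to any particular warehouse tool mentioned in the essay.

```python
# Illustrative sketch only: a toy pipeline mirroring the four steps
# (collect, cleanse/transform, aggregate/analyze, present).
# All data, field names, and helpers here are hypothetical assumptions.

from collections import defaultdict

def collect():
    # Data collection: identify and pull source records (hard-coded here).
    return [
        {"store": "North", "product": "A", "amount": "120.50"},
        {"store": "North", "product": "B", "amount": " 80.00"},
        {"store": "South", "product": "A", "amount": "95.25"},
        {"store": "South", "product": "A", "amount": None},  # dirty record
    ]

def cleanse(rows):
    # Cleansing/transformation: drop unusable records and cast types.
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue
        clean.append({**row, "amount": float(row["amount"])})
    return clean

def aggregate(rows):
    # Aggregation/analysis: total sales per store (a simple "query").
    totals = defaultdict(float)
    for row in rows:
        totals[row["store"]] += row["amount"]
    return dict(totals)

def present(totals):
    # Presentation: render the results as a small text table.
    print(f"{'Store':<10}{'Total sales':>12}")
    for store, total in sorted(totals.items()):
        print(f"{store:<10}{total:>12.2f}")

if __name__ == "__main__":
    present(aggregate(cleanse(collect())))
```

In a real warehouse each step would be backed by dedicated tooling rather than in-memory lists, but the ordering of the stages is the same as described above.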

Sunday, February 2, 2020

Anomaly Detection Scheme for Prevention of Online Attacks Dissertation

The time parameter reflects any deviation from the normal duration taken to disseminate information and receive feedback. The efficiency of communication is therefore slowed down, and this cripples the activities of an institution. Hacking of the internet system distorts the original information that was fed in and may bring about a jam. All of these are prevented by the use of highly advanced and sophisticated modern devices that quickly sense an attack and produce signals to notify the controller (Chiang, 2004).

Data analysis must be undertaken to confirm and ensure that only the vital information is online and accessed by the target population. The systems are made in such a way that they are able to identify the geographical location of an attacker, who can then be easily traced so that legal action may be taken. Incoming attacks may also be blocked by an automated program in the system. Updating should always be done to facilitate prompt detection of attacks; this ensures the system remains at pace with any new technological changes. When all security measures are considered, the privacy of an institution remains secured, and its data remains at the disposal of authorized personnel only. Transmission of information must be accomplished in the shortest time possible.

According to Chiang (2004), system-level visualization is done to integrate the technology with the system's hardware, software, or both. This ensures protection by offering an opportunity to study and analyze visual patterns that indicate any possible attack. Sensors are used to detect attacks and send an alert signal in the form of graphs on a screen. Multiple attacks are easily displayed and tracked down to their sources. This calls for quick action in order to protect the data, which includes resetting the connection. All of the forecasting and analysis is done in a data warehouse. This method ensures that quick and smooth action is taken to counteract any attack in the shortest time possible.

Selecting heterogeneous thresholds and conducting a proper correlation analysis ensures that systems are well set up to accommodate large amounts of data and to detect even slight attacks at any moment. A web of links is made that connects the main system to several others. A threshold value is also set which sounds an alarm when it is exceeded; a minimal sketch of this idea appears at the end of this post. Detection of attacks becomes easier since either system signals the main server (Chiang, 2004). An internal program is installed to ensure the system is able to detect any foreign data and to differentiate self from non-self before sending a signal. The system is thus protected from collapsing and is surrounded by appropriate buffer zones to ensure the best possible results are obtained.

An anomaly refers to a deviation from the normal way in which information systems operate. It compromises the confidentiality and security of the information contained within the system. Any delay in detecting it and restoring normal operation may result in serious negative impacts. Computers should therefore be protected from any form of attack by the installation of a specialized and highly sensitive detector, called a detection scheme. It is backed up by additional security features which limit access to specific individuals and to a central point. The system is thus well cushioned and its security guaranteed. Most institutions trust the viability of this security measure.
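The threshold-and-alarm idea described above can be illustrated with a small Python sketch. The metric (requests per second), the window size, and the threshold multiplier below are assumptions made for the example, not values taken from the dissertation.

```python
# Illustrative sketch of threshold-based anomaly detection: raise an
# "alarm" when a new observation deviates too far from a rolling baseline.
# The metric, window size, and threshold multiplier are assumptions.

from collections import deque
from statistics import mean, stdev

class ThresholdDetector:
    def __init__(self, window=20, multiplier=3.0):
        self.history = deque(maxlen=window)   # recent "normal" observations
        self.multiplier = multiplier          # deviations beyond this many std-devs count as anomalous

    def observe(self, value):
        """Return True if the value exceeds the threshold (anomaly)."""
        if len(self.history) >= 5:            # need some baseline first
            baseline = mean(self.history)
            spread = stdev(self.history) or 1.0
            if abs(value - baseline) > self.multiplier * spread:
                return True                   # alarm: deviation from normal behaviour
        self.history.append(value)            # only normal values update the baseline
        return False

if __name__ == "__main__":
    detector = ThresholdDetector()
    traffic = [100, 98, 103, 101, 99, 102, 100, 450, 97]  # 450 simulates an attack spike
    for t, requests in enumerate(traffic):
        if detector.observe(requests):
            print(f"t={t}: alarm, {requests} requests/s exceeds threshold")
```

A production scheme would correlate signals from many such sensors and forward alarms to a central server, as the text describes, but the basic mechanism of comparing observations against a learned baseline and a threshold is the same.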