Sunday, December 6, 2009

Emerging threats to business security

Traditionally, businesses considered threats to be the actions of other people, such as a robbery, but now more than ever they need to be concerned about the security of their networks. The number, variety and strength of threats to business computers and network security have increased dramatically, and businesses need to be prepared against all types of malware attack.

Malware is one of the biggest threats to business computer users on the Internet today. It can hijack your browser, redirect your search attempts, serve up nasty pop-up ads, track what web sites you visit, and generally screw things up. Malware programs are usually poorly programmed and can cause your computer to become unbearably slow and unstable, in addition to all the other havoc they wreak. Many of them will reinstall themselves even after you think you have removed them, or hide themselves deep within Windows, making them very difficult to clean. There are different varieties of malware, including spyware, Trojan horses and more. Below is a list of other potential threats and types of attack that businesses face:

Spyware remains a growing concern for businesses. In light of recently introduced data protection legislation, data theft and loss will remain high-profile concerns for the foreseeable future.

Phishing (both web- and e-mail-based) is probably the worst current threat and will remain so during the next year. It is also one of the most dangerous, because it causes direct losses to victims (stolen bank accounts usually get "cleaned out" within hours or days).

A computer worm is a self-replicating program that 'survives' on its own: unlike a virus, it does not rely on attaching itself to another program or file.

A spoofing attack is a situation in which one person or program successfully masquerades as another by falsifying data, thereby gaining an illegitimate advantage.

Denial-of-service attacks exploit known vulnerabilities in specific applications, operating systems, protocols or services to deny authorised users access to information or computers, e.g. web sites.

In a man-in-the-middle attack, the attacker (M) makes two parties (A and B) believe that they are talking directly to each other, while relaying (and possibly injecting) messages between them.
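As a toy illustration of the relay idea (not a real network attack; the party names, message text and `mallory_relay` function are invented for the example), the logic can be sketched in a few lines of Python:

```python
# Toy man-in-the-middle relay: A believes it is sending straight to B,
# but every message passes through M, who can read or alter it.

b_inbox = []          # what B actually receives
intercepted = []      # M's copy of everything relayed

def mallory_relay(msg, deliver):
    """M eavesdrops on the message, optionally tampers, then passes it on."""
    intercepted.append(msg)                   # eavesdrop
    deliver(msg.replace("1000", "9000"))      # tamper, then relay

# A's "direct" channel to B is really M's relay.
mallory_relay("transfer 1000 to account 42", b_inbox.append)

print(b_inbox)       # B sees the tampered message
print(intercepted)   # M holds a copy of the original
```

Neither A nor B can tell anything is wrong from the messages alone, which is why defences rely on authenticating the channel rather than the content.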


Security providers are normally focused on protecting computer applications, but computer literacy is increasing dramatically and the line between private and business use of computers and networks is blurring. Businesses need to keep a close eye on their employees' activities on company networks and ensure that network security is not put at stake, because today's biggest and most prominent emerging threats are targeted at this emerging online lifestyle. For businesses, firewalls remain a mainstay of network security against automated threats such as worms or botnets, when coupled with strong antivirus protection at both server and client levels.

Neural Networks and Business

Neural networks (NNs, or ANNs for artificial neural networks) are biologically inspired and operate somewhat like a human brain. They are capable of learning and self-organisation, and are tolerant of error and noisy data. A neural network is a set of connected artificial neurones (nodes), along with weights for the connections, that operate concurrently and collectively. Neural networks provide significant benefits in business applications and are actively used for tasks such as bankruptcy prediction, cost prediction, revenue forecasting, document processing and more. Below I have listed some of the common applications of neural networks in business.

(1) Detecting common characteristics in large amounts of business data is a type of classification problem. Neural networks can be used to solve classification problems, typically through Multi-Layer Perceptron (MLP) and Support Vector Machine (SVM) type networks. Examples of classification applications in business include dividing research populations or data into groups for further study. For example, data can be extracted from databases to determine potential business ventures for investors. A classifier can also be designed to help the sales side of a business predict the expected revenue range of a movie or new CD before its release. Results have demonstrated that neural networks can predict success or sales better than other statistical methods currently employed.
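To make the idea concrete, here is a minimal sketch of the simplest such network, a single-layer perceptron, learning a toy classification rule (a real business classifier would use an MLP or SVM library on real data; the AND-gate data here is invented purely for illustration):

```python
# Minimal single-layer perceptron learning the AND rule.

def predict(weights, bias, x):
    """Step activation: fire (class 1) if the weighted sum exceeds zero."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

def train(data, lr=1.0, epochs=10):
    """Perceptron learning rule: nudge weights by the prediction error."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train(data)
print([predict(weights, bias, x) for x, _ in data])  # [0, 0, 0, 1]
```

The same learn-from-error loop, stacked into multiple layers, is what an MLP uses on the high-dimensional data of a real classification task.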

(2) Forecasting the relationship between multiple factors in business data is a type of function approximation problem. Neural networks can be used to solve function approximation problems, typically through Multi-Layer Perceptron (MLP), Radial Basis Function (RBF) and CANFIS (Co-Active Neuro-Fuzzy Inference System) type networks. Examples of function approximation in business include predicting changes to prices and costs. For example, data from studies can help predict bankruptcy for credit-risk assessment, or forecast sales.

(3) Forecasting how business data evolves over time is a type of time-series prediction problem. Neural networks can be used to solve time-series problems, typically through Time-Lagged Recurrent Neural Network (TLRNN) type networks. Examples of time-series prediction in business include forecasting revenue and expenses. For example, data from business studies can be used to predict labour, material, utility or other costs over time. By determining cost factors for engineering and business decisions, you could provide better estimates for the manufacturing process.
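A TLRNN is too much for a short example, but the core idea it builds on — predict the next value from a window of lagged past values — can be sketched with a plain linear autoregressive fit. The sine-wave "series" below is invented for illustration, and the 2x2 normal equations are solved by hand:

```python
import math

# Fit next_value ≈ a*value[t-1] + b*value[t-2] by least squares,
# then predict the one value held out of the training window.

series = [math.sin(0.3 * t) for t in range(50)]

# Lagged training rows: (previous value, value before that, target).
rows = [(series[t - 1], series[t - 2], series[t]) for t in range(2, 49)]

# Normal equations for y = a*x1 + b*x2, solved via Cramer's rule.
s11 = sum(x1 * x1 for x1, x2, y in rows)
s12 = sum(x1 * x2 for x1, x2, y in rows)
s22 = sum(x2 * x2 for x1, x2, y in rows)
r1 = sum(x1 * y for x1, x2, y in rows)
r2 = sum(x2 * y for x1, x2, y in rows)

det = s11 * s22 - s12 * s12
a = (r1 * s22 - r2 * s12) / det
b = (s11 * r2 - s12 * r1) / det

prediction = a * series[48] + b * series[47]
print(prediction, series[49])  # predicted vs. actual next value
```

A TLRNN replaces this fixed linear map with learned, nonlinear memory, which is what makes it useful on messy cost and revenue series.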

(4) Identifying characters in images or video feeds is a type of image processing problem. Neural networks can be used to solve image processing problems, typically through Principal Component Analysis (PCA) type networks. Examples of image processing in business include optical character recognition (OCR) and biometrics. For example, business card images can be scanned so that the contact information is input directly into contact managers such as Outlook and PDA devices.
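To see what a PCA-style network extracts, here is a hand-rolled two-dimensional PCA that finds the dominant direction in a point cloud, using the closed-form solution for a 2x2 covariance matrix (the data is invented; a PCA network, e.g. one trained with Oja's rule, learns the same direction iteratively from the data stream):

```python
import math

# Principal component of 2-D data: the axis along which the data varies most.

points = [(x, 2.0 * x + 0.1 * ((-1) ** x)) for x in range(10)]  # roughly y = 2x

n = len(points)
mx = sum(p[0] for p in points) / n
my = sum(p[1] for p in points) / n

var_x = sum((p[0] - mx) ** 2 for p in points) / n
var_y = sum((p[1] - my) ** 2 for p in points) / n
cov = sum((p[0] - mx) * (p[1] - my) for p in points) / n

# Angle of the dominant eigenvector of [[var_x, cov], [cov, var_y]].
theta = 0.5 * math.atan2(2 * cov, var_x - var_y)
direction = (math.cos(theta), math.sin(theta))
print(direction)  # roughly proportional to (1, 2)
```

In OCR or biometrics the same computation runs on pixel vectors with thousands of dimensions, and the leading components become the compact features fed to a classifier.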

(5) Grouping business data based on key characteristics is a type of clustering problem. Neural networks can be used to solve clustering problems, typically through Self-Organizing Map (SOM) type networks. Examples of clustering in business include the detection of key characteristics in demographics and feature extraction. For example, data from studies concerning credit risk can be evaluated by extracting rules for determining credit risk. Neural network decisions can be clarified by explanatory rules that capture the learned knowledge embedded in the network, helping the credit-risk manager explain why a particular applicant is classified as either good or bad.
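A full SOM needs a node grid and a shrinking neighbourhood, but its core update — "the winning node moves toward the input" — can be sketched on its own. This is stripped-down competitive learning (the neighbourhood update is omitted), and the two-cluster data and starting weights are invented for the example:

```python
# Competitive learning, the core of a SOM: for each input, find the
# closest node (the best matching unit) and move it toward the input.
# A real SOM also updates the winner's grid neighbours with a shrinking radius.

def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

data = [(0, 0), (1, 0), (0, 1), (1, 1),        # cluster around (0.5, 0.5)
        (9, 9), (10, 9), (9, 10), (10, 10)]    # cluster around (9.5, 9.5)

nodes = [[1.0, 1.0], [9.0, 9.0]]               # deterministic starting weights
lr = 0.5

for _ in range(20):                            # training epochs
    for x in data:
        winner = min(range(len(nodes)), key=lambda i: dist2(nodes[i], x))
        nodes[winner] = [w + lr * (xi - w) for w, xi in zip(nodes[winner], x)]

print(nodes)  # node 0 settles near (0.5, 0.5), node 1 near (9.5, 9.5)
```

Each node ends up summarising one cluster, which is exactly how a SOM turns raw demographic records into a small map of customer segments.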

Wednesday, October 28, 2009

Genetic Algorithms

A genetic algorithm searches for possible solutions to a problem using the mechanics of natural selection and genetics. Chromosomes represent the solutions within the algorithm and are randomly created at the beginning of a run; the way a chromosome stores the solution it represents is called its representation. Because the solutions obtained can be good, bad or infeasible, each chromosome in the population is given a fitness value that evaluates its fitness or "suitability". The genetic algorithm then selects chromosomes from the population and combines them to produce new "offspring" chromosomes. These offspring form a new population (or replace some of the chromosomes in the existing population) in the hope that the new population will be better than the previous ones.

Genetic algorithms produce new chromosomes by combining existing chromosomes. This operation is called crossover. A crossover operation takes parts of the solution encodings from two existing parent chromosomes and combines them into new chromosomes. The details depend on the chromosome representation and can be very complicated. Below is an example of the crossover operation.

Given two chromosomes

10001001110010010

01010001001000011

Choose a random bit along the length, say at position 9, and swap all the bits after that point

so the above become:

10001001101000011

01010001010010010

After a crossover operation is performed, a mutation operation is applied. Mutation makes small changes to an encoded solution. Like crossover, how mutation works depends on the type of representation that is selected.
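The worked example above maps directly onto code. Here is a small Python sketch of single-point crossover, run on the same two chromosomes and crossover point as above, together with a bit-flip mutation (the 5% mutation rate is an arbitrary choice for the example):

```python
import random

def crossover(parent_a, parent_b, point):
    """Single-point crossover: swap everything after the chosen position."""
    child_a = parent_a[:point] + parent_b[point:]
    child_b = parent_b[:point] + parent_a[point:]
    return child_a, child_b

def mutate(chromosome, rate, rng):
    """Flip each bit independently with a small probability."""
    return "".join(
        ("1" if bit == "0" else "0") if rng.random() < rate else bit
        for bit in chromosome
    )

a, b = crossover("10001001110010010", "01010001001000011", 9)
print(a)  # 10001001101000011
print(b)  # 01010001010010010
print(mutate(a, 0.05, random.Random(0)))  # a, with perhaps a bit or two flipped
```

For a bitstring representation both operators are trivial; for richer representations (trees, permutations) they must be designed so every child is still a valid solution.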

Fitness operations and fitness comparators are also used to manipulate chromosomes. A fitness operation measures the quality of each chromosome produced, so the genetic algorithm knows what to optimize. Fitness comparators compare chromosomes based on their fitness and tell the genetic algorithm whether it should minimize or maximize the fitness values of chromosomes.

Another step performed by the genetic algorithm is the introduction of new chromosomes into a population. These new chromosomes can replace an entire population or just a few of its chromosomes. The algorithm ends when the best solution has not changed for a preset number of generations.
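Putting selection, crossover, mutation, fitness evaluation and that stopping rule together, a complete run can be sketched on a deliberately trivial problem: maximise the number of 1-bits in a bitstring (often called OneMax). All the parameter values below are arbitrary choices for the example:

```python
import random

def run_ga(length=20, pop_size=30, stall_limit=15, seed=1):
    """Maximise the count of 1-bits, stopping when the best solution
    has not improved for `stall_limit` generations."""
    rng = random.Random(seed)
    fitness = lambda ch: ch.count("1")

    pop = ["".join(rng.choice("01") for _ in range(length))
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    stall = 0

    while stall < stall_limit:
        def pick():                           # tournament selection of size 2
            x, y = rng.sample(pop, 2)
            return x if fitness(x) >= fitness(y) else y

        new_pop = [best]                      # elitism: carry the best forward
        while len(new_pop) < pop_size:
            pa, pb = pick(), pick()
            point = rng.randrange(1, length)  # single-point crossover
            child = pa[:point] + pb[point:]
            child = "".join(                  # bit-flip mutation (2% per bit)
                ("1" if c == "0" else "0") if rng.random() < 0.02 else c
                for c in child)
            new_pop.append(child)
        pop = new_pop

        champion = max(pop, key=fitness)
        if fitness(champion) > fitness(best):
            best, stall = champion, 0
        else:
            stall += 1
    return best, fitness(best)

best, score = run_ga()
print(best, score)
```

Because elitism keeps the best chromosome and fitness is bounded, the best score can only improve a finite number of times, so the stall counter guarantees the run terminates.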

Tuesday, October 27, 2009

Risk Management

Risk management is a logical process or approach that seeks to eliminate, or at least minimize, the level of risk associated with a business operation. Essentially, the process identifies any type of situation that could result in damage to any resource within the possession of the company, including personnel, and then takes steps to correct factors that are highly likely to cause that damage. There are six major processes involved in risk management: risk management planning, risk identification, qualitative risk analysis, quantitative risk analysis, risk response planning, and risk monitoring and control.

(1) Risk management planning is the process of deciding how to approach and plan risk management activities for a project; its main output is a risk management plan. A risk management plan documents the procedures for managing risk throughout the project. Several planning meetings should be held early in the project's life cycle to help develop it. In addition to a risk management plan, many projects also include contingency plans, fallback plans and contingency reserves.
A risk breakdown structure is a useful tool that can help project managers consider potential risks in different categories. It is a hierarchy of potential risk categories for a project.

(2) Risk identification is the process of determining which potential events might hurt or enhance a particular project. Identifying potential risks early is important, but you must also continue to identify risks as the project environment changes. Five common information-gathering techniques are brainstorming, the Delphi technique, interviewing, root cause analysis and SWOT analysis. The main output of this process is the risk register, a document that contains the results of the various risk management processes, often displayed in a table or spreadsheet format.

(3) Qualitative risk analysis involves assessing the likelihood and impact of identified risks. A probability/impact matrix or chart lists the relative probability of a risk occurring on one side of the matrix (or one axis of the chart) and the relative impact of the risk occurring on the other. Each risk is then rated as high, medium or low in terms of its probability of occurrence and its impact if it did occur.
Top ten risk item tracking is a qualitative risk analysis tool; in addition to identifying risks, it maintains an awareness of risks throughout the life of a project by helping to monitor them.

(4) The main techniques for quantitative risk analysis include data gathering and modeling techniques. A decision tree is a diagramming analysis technique used to help select the best course of action in situations where future outcomes are uncertain. Expected monetary value (EMV) is the product of a risk event's probability and its monetary value. To create a decision tree, and to calculate EMV specifically, you must estimate the probabilities of certain events occurring.
Simulation uses a representation or model of a system to analyze its expected behavior or performance. Most simulations are based on Monte Carlo analysis.
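EMV is simple enough to compute by hand or in a few lines of code. The probabilities and payoffs below are invented to illustrate the decision-tree comparison between two alternatives:

```python
# Expected monetary value: probability of each outcome times its payoff,
# summed per alternative; pick the branch with the highest EMV.
# All figures are made up for the example.

def emv(outcomes):
    return sum(prob * value for prob, value in outcomes)

projects = {
    "Project 1": [(0.20, 300_000), (0.80, -40_000)],
    "Project 2": [(0.50, 50_000), (0.50, -20_000)],
}

for name, outcomes in projects.items():
    print(name, emv(outcomes))
# Project 1: 0.2 * 300000 + 0.8 * (-40000) = 28000
# Project 2: 0.5 * 50000  + 0.5 * (-20000) = 15000

choice = max(projects, key=lambda name: emv(projects[name]))
print(choice)  # Project 1
```

Even though Project 1 fails four times out of five, its large payoff gives it the higher EMV, which is exactly the kind of non-obvious conclusion a decision tree is meant to surface.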

(5) Risk response planning involves taking steps to enhance opportunities and reduce threats to meeting project objectives. Using outputs from the preceding risk management processes, project teams can develop risk response strategies, which often result in updates to the risk register and project management plan as well as risk-related contractual agreements.

(6) Risk monitoring and control involves monitoring identified and residual risks, identifying new risks, carrying out risk response plans and evaluating the effectiveness of risk strategies throughout the life of the project. The main outputs of this process include recommended corrective and preventive actions; requested changes; and updates to the risk register, project management plan and organizational process assets.

Friday, October 16, 2009

The History of AI

Egyptian folklore refers to robot-like figures living amongst humans, though they were imagined to be made out of stone. The idea of creating lifelike machines is an old one that has raised a number of ethical concerns. In 1640, Descartes argued that although machines might pass as animals, they could never pass as humans.

Years later, in 1822, Charles Babbage invented the difference engine, which some regard as the first computer. Almost 120 years later, in 1941, the first electronic computer was invented. This invention revolutionized every aspect of storing and processing information.

The name "artificial intelligence" came from John McCarthy in 1956, when he ran the Dartmouth Summer Research Project on Artificial Intelligence, which brought together the founders of AI. The project aimed to understand how the human brain works in order to program machines to adapt to their environment. In 1958 McCarthy also created the LISP language. LISP stands for LISt Processing, and it became the language of choice among most AI developers.

In 1963 the United States government gave MIT $2.2 million to be used in researching AI, so that the US would stay ahead of the Soviet Union in technological advancements.

In 1971 Terry Winograd created SHRDLU at MIT to solve spatial and logic problems. It demonstrated that a computer is capable of understanding and interpreting a predefined world.

During the 1980s the computer's impact hit the world, as companies learned how much money they could save through the efficiency of computers. In 1986, US sales of AI hardware and software surged to $425 million. Many big companies were using XCON, an expert system designed to configure the large VAX computers. In 1986-87, however, demand for AI systems decreased, and the industry lost almost half a billion dollars.

AI slowly recovered, and in 1991 DARPA reported that an AI-based logistics planning tool, DART, used in the military operations Desert Shield and Desert Storm, had repaid decades of research investment. In 1996 the Tamagotchi was introduced and became one of the most popular toys of the 90s. In 1997, IBM's chess computer Deep Blue showed the world the potential of computer software when it beat world chess champion Garry Kasparov.

The new millennium brought new games and computerised toys to the world: pet robots that can learn, and Lego Mindstorms, which lets children begin programming and building robots.

AI continues to grow, and who knows what new technology will be introduced in the near future. All that is certain is that we have not seen the last of new developments within AI.

Monday, October 12, 2009

Social networking: is it all just a fad or is it a profitable business?

Social networking sites are very popular nowadays. Almost all online users, especially the younger ones, are active members of one or more social networking sites. Many Internet-based businesses are also targeting social networking sites to take advantage of the huge volumes of user traffic.

Social networking sites are web sites that facilitate friendly and active interaction among members. Most are Internet-based and aim to provide varied and interesting means by which users can interact. Such features may include instant messaging, video calling, chat, file sharing, discussion groups, voice chat, e-mail, blogging and so on. The most popular social networking sites are Facebook, MySpace, Twitter and Bebo.

There are loads of different opinions as to whether social networks are profitable. In my opinion, you can use social networks like Facebook to enhance your own business opportunities. Social networking gives you the chance to promote your products or services more subtly, through a friendly environment; promotion becomes more a matter of friends sharing insights and information with friends. It also lets you approach this friendlier marketplace on an international level, since many social networks attract people from around the world. Finally, if you so desire, social networking allows you to target specific groups of individuals that you think would be most interested in the products or services you offer.

Of course there is a big difference between profiting from Facebook and Facebook being a profitable organisation. One of the problems I see with Facebook, Twitter, MySpace, etc. is that the revenue model seems to be an afterthought. They got over the hump of attracting millions of users, but once they had them, they said "Now what?" and flocked to advertising.

I don't think there is one simple answer to the revenue question, but I will say this: I wouldn't worry too much about the 8 million people in the "We will not pay to use Facebook" groups; those same 8 million are the ones in the "Facebook needs to go back to the old design, or we're leaving" groups. As long as their friends are there, there is no comparable social networking site, and Facebook maintains some free or very cheap version of its site, people will stay.

Tuesday, October 6, 2009

Cloud Computing

Cloud computing is a general term for anything that involves delivering hosted services over the Internet, with the goal of providing easy, scalable access to computing resources and IT services. It is a paradigm in which information is permanently stored on servers on the Internet and cached temporarily on clients, including desktops, entertainment centres, tablet computers, notebooks, etc. The term "cloud computing" comes from the cloud symbol usually used to depict the Internet in network diagrams.

Clouds can be either public or private. Public clouds sell services to anyone on the Internet; this is cloud computing in the traditional mainstream sense, whereby resources are dynamically provisioned on a self-service basis over the Internet. A private cloud is a network that supplies hosted services to a limited number of people. There is a big push for cloud computing services by several big companies, and Amazon is currently the largest public cloud on the Internet.

Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides resizable compute capacity in the cloud. It is designed to make web-scale computing easier for developers. Amazon EC2's simple web service interface gives you complete control of your computing resources and lets you run them in Amazon's computing environment. It reduces the time required to obtain and boot new server instances to minutes, allows you to pay only for the capacity you actually use, and provides developers the tools to build stronger applications that are less likely to fail.

In cloud computing models, customers do not own the infrastructure they are using; they essentially rent it, or pay as they use it. This is one of the drawbacks of cloud computing, but it can be outweighed by several positives. One of the major selling points is lower cost: companies have lower technology-based capital expenditure, which should enable them to focus their money on delivering the goods and services they specialize in. Cloud computing is also thought to improve reliability and scalability. One of the major topics in information technology today is data security; in a cloud infrastructure, security typically improves overall, although sensitive data can be at higher risk. Finally, cloud computing results in improved resource utilization, which is good for sustainability.