Thursday, June 19, 2008

Web 2.0 And Why You Shouldn't Fake Reviews

by: Simon Dance

The latest offering from Ramsay's Kitchen Nightmares, which aired on Channel 4 last night, followed the somewhat disastrous adventures of ex-boxer Mike and his wife Caron Ciminera as they struggled to run the Fish & Anchor, a restaurant in Lampeter, West Wales. Whilst the couple's arguing appeared to outdo the food they were originally sending out (a mix of jarred sauces and cook-book trophy dishes), they did let slip a fantastically poor bit of black hat optimisation, which I hope made all white hat SEOs laugh out loud. If there was one lesson to take away from the show, it would be: don't fake reviews!

To gauge the local community's feeling about the failing restaurant-come-sports-bar, Ramsay conducted a search on Google for the Fish & Anchor and was presented with a range of reviews, two of which were rather suspiciously from a character calling himself Michael or Mike Burns. On the Wales portal of the BBC website Burns had posted: "Well i don't get excited about food too often, and having dined in Rick Stein's, and Gordon Ramsay's, I think i have found a better restaurant in West Wales". On the SugarVine website he also posted: "what a fantastic restaurant for couples, and families. it seems to have everything, the food has to be the best i have eaten (home or abroad) this place will go far".

Other online reviews echoed what had already been said, but given the dire state of the restaurant, its food, its reputation and its perception among both the local community and Ramsay himself, would it not be right to question who was telling the truth? The restaurateur confessed to posting the reviews, his rationale being to stimulate custom. However, any reactive strategy requires a degree of foresight, and I am not sure he really thought through the wider ramifications of posting these "inaccurate" reviews. Firstly, a warning must be expressed. 
For example, if someone finds your restaurant or hotel via a positive (fake) review and has a bad experience, there is a chance they will post a true review to assist fellow users and generally have a rant. The seeding of this true review has the potential to lead to an onslaught of further reviews from other visitors who might not otherwise have posted. Don't forget the saying: "people don't lead... they follow".

But how can you manage your reviews and, ultimately, what your customers are saying about you? Well, first and foremost, address the problem(s)! You wouldn't put a sticking plaster on a gunshot wound, so why think that a positive review about the quality of your food or the softness of your sheets is going to counteract poor customer service? The customer is king, a point stressed by Ramsay, and one that should ring true for any business; after all, without them, where would we be? Rectifying, or at least making plans to manage, any failings within your business, regardless of its size, is the first step in managing your online reputation, but that is an area I will not go into in comprehensive detail in this post. Instead, I will offer some simple pointers on how to harness online reviews for good.

Sites like Trip Advisor, which boasts over 10,000,000 user-generated reviews of various hotels, holidays and restaurants, are gaining increasing weight as a resource for honest and unbiased reviews, and via its system of community recommendation it really has the power to drive custom, and in many instances to divert it - the key factor being positive, consistent reviews. If you do run a successful hotel or restaurant and wish to harness these social spaces, but in a more ethical way than that demonstrated in Kitchen Nightmares, then why not encourage your diners or hotel guests to post a review after their stay? 
When the customer is paying their bill, or even booking their hotel room, why not take their email address, or even ask them to submit their business card in return for entry into a monthly prize draw for a free meal in the restaurant? In addition to building up a client database by collecting this data - for use in promotional mailings, including notifying customers of events, promotions and the launch of a new menu - you can also harness it to stimulate online reviews by dropping your customers a short email after their stay or meal, which might look something like the following example:

"Good afternoon Simon, and thank you very much for your booking at the Leapfrogg Restaurant; we hope you had an enjoyable meal. We pride ourselves on the quality of our food and our attentive staff, however we're always striving to enhance and improve what we do, and as such we would appreciate you taking two minutes of your time to write a review for us at Trip Advisor (http://www.tripadvisor.com), a free travel guide and research website that allows users to post reviews and ratings. Your comments are important to us, and will be used to improve the Leapfrogg restaurant. Thank you very much for your time and we look forward to welcoming you again to the Leapfrogg restaurant in the near future. Sincerely, A Restaurateur, Leapfrogg restaurant, Brighton, Tel: 01273 669 450"

Of course, many of your requests will be ignored, but provided you are personal in your emails (a point we at Leapfrogg have mentioned previously in this blog) you are more likely to get a response, and even if you only have a 5% success rate, that is still 5% of valuable customer feedback. A point on which I will conclude this article, one which has stuck with me from London's SMX and one that I will most certainly be repeating from here on out, is that "Yesterday's news no longer wraps today's fish and chips". 
Online news and online content, including user-generated reviews, do not simply get binned like a newspaper at the end of the day; they remain live, and can even appear within the search results for a brand keyword search... so isn't it worth paying attention to what your customers are saying?

A Guide on RSS Tool

by: Terry Leslie

RSS is an abbreviation that has evolved into the following, depending on the version:

• RDF Site Summary (also known as RSS 0.9; the first version of RSS)
• Rich Site Summary (also known as RSS 0.91; a prototype)
• Really Simple Syndication (also known as RSS 2.0)

Today, RSS stands for 'Really Simple Syndication', and it has the following 7 existing formats or versions: 0.90, 0.91, 0.92, 0.93, 0.94, 1.0 and 2.0.

RSS tools refer to a group of file formats designed to share headlines and other web content (this may be a summary or simply 1 to 2 lines of the article), links to the full versions of the content (the full article or post), and even file attachments such as multimedia files. All of this data is delivered in the form of an XML file (XML stands for eXtensible Markup Language), which goes by the following common names:

• RSS feed
• Webfeed
• RSS stream
• RSS channel

Feeds are typically shown on web pages as an orange rectangle that usually has the letters XML or RSS in it.

RSS feeds can be used to deliver any kind of information. Some of these 'feeds' include:

• Blog feed - each blog entry is summarized as a feed item. This makes blog posts easier to scan, enabling visitors to zoom in on their items of interest.
• Article feed - alerts readers whenever new articles and web content are available.
• Forum feed - allows users to receive forum posts and the latest discussion topics.
• Schedule feed - allows users (such as schools, clubs, and other organizations) to broadcast events and announce schedule changes or meeting agendas.
• Discounts or Specials feed - enables users (such as retail and online stores) to deliver the latest specials and discounted offers.
• Ego or News Monitoring feed - enables users to receive filtered headlines or news based on a specific phrase or keyword. 
• Industry-specific feed - used by technical professionals to market, promote, or communicate with current (and prospective) customers and clients within their specific industries.

RSS feeds enable people to track numerous blogs and news sources at the same time. To produce an RSS feed, all you need is the content or the article that you want to publicize and a validated RSS text file. Once your text file is registered at various aggregators (or 'news readers'), any external site can then capture and display your RSS feed, automatically updating whenever you update your RSS file.

RSS tools are useful for sites that add or modify their content on a regular basis. They are especially suited to 'web syndication', or activities that involve regular updates and/or publications, such as the following:

• News websites - as used by major news organizations such as Reuters, CNN, and the BBC
• Marketing
• Bug reports
• Personal weblogs

There are many benefits to using RSS feeds. Aside from being a great supplemental communication method that streamlines the communication needs of various sectors, RSS tools and feeds can also bring tremendous benefits to your business, particularly in the field of internet marketing. Below are some of the RSS features that can help make your internet marketing strategies more effective.

1. Ease of content distribution. With RSS, your business content can be captured and displayed by virtually any external site, giving you an easy way to spread out and advertise it.

2. Ease of regular content updates. With RSS, web content concerning your business can be automatically updated on a daily (and even hourly) basis. 
Internet users will be able to experience 'real time' updates as information in your own file (such as new products and other business-related releases) is changed and modified, simultaneously updating the RSS feeds that people are subscribed to.

3. Custom-made content services. With RSS, visitors can have personalized content services, giving them total control over the flow and type of information they receive. Depending on their interests and needs, visitors can subscribe to only the content they are looking for (such as real estate or job listings).

4. Increased (and targeted) traffic. With RSS, traffic is directed to your site as readers of your content summary (or the 1 to 2 lines of your article) who find it interesting click through to your site.

These are just several of the many things that you can do with RSS. The possibilities are endless, and they are all aimed at providing you with an effective internet marketing strategy for your business. In the meantime, good luck on your journey to success.
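The mechanics described above are simple enough to sketch end to end. The snippet below builds a minimal RSS 2.0 file with Python's standard library and then reads its headlines back the way an aggregator or 'news reader' would; the channel, titles and links are invented purely for illustration:

```python
import xml.etree.ElementTree as ET

def build_feed():
    # An RSS feed is just an XML file: an <rss> root, one <channel>,
    # and one <item> per article carrying a title, link and short summary.
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Example Blog"
    ET.SubElement(channel, "link").text = "http://www.example.com/"
    ET.SubElement(channel, "description").text = "Latest posts and specials"
    for title, slug in [("First post", "first-post"), ("Second post", "second-post")]:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = "http://www.example.com/" + slug
        ET.SubElement(item, "description").text = "A 1 to 2 line summary of the article."
    return ET.tostring(rss, encoding="unicode")

def read_headlines(feed_xml):
    # An aggregator only needs to walk the <item> elements to list headlines
    root = ET.fromstring(feed_xml)
    return [(i.findtext("title"), i.findtext("link")) for i in root.iter("item")]

feed = build_feed()
for title, link in read_headlines(feed):
    print(title, "->", link)
```

Registering the resulting file with aggregators, and regenerating it whenever your content changes, is essentially all that 'publishing a feed' amounts to.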

Symantec Norton AntiBot: The Latest In Norton Computer Protection Software

by: Lisa Carey

It seems like every other month a new "program" comes along to make our lives that much easier. For example, first we could bookmark favorites, then RSS feed them, then came widgets, and now "bots": robots that do a lot of our computer work for us in the background. Examples of friendly bots are weather bots, game-playing bots and instant messaging bots; my favorites are those on AOL Instant Messenger, which perform all kinds of functions for me, like shopping, finding movie times and even giving updates on the Wall Street Journal.

Unfortunately, not all bots were created "equal." Some are friendly and some are not. The unfriendly ones can be a form of malware that releases control of your computer, providing hackers with the opportunity to access your information and spread harmful bots to others. This type of computer virus can then be used to spread spam and commit various types of identity theft and other online fraud.

With new threats to our computers and information, new methods of protection are required. One of the oldest and most well known software protection designers has recently released a new protection program, Symantec Norton AntiBot. This software product is designed to prevent the hijacking of one's personal computer by bots, and it turns the design of the bots against them in order to locate and destroy them. Many people already employ some form of protection on their personal computer, such as raising the Internet security level to "high." But these measures cannot detect some of the most recent bot programs and may not be the most efficient means of information protection, especially with the Internet being used more and more frequently for online shopping, ticket purchases, travel and other "high risk" activities. A more effective method of detecting and eliminating threats caused by bots is to install software designed specifically to detect, destroy and prevent bots from having access to your computer. 
With Symantec Norton AntiBot software, protection against bots is enhanced several times over and the threat of bot attack is greatly diminished. The program protects against bots by blocking them from entering your computer through downloads and e-mail attachments (two of the common ways bots enter a personal computer), checking for any unusual behavior on your personal computer and eliminating it, and detecting malicious bot software at all levels; keeping your personal, financial and credit card information safe and stopping identity theft before it can occur.

Because bots operate in the background and are not always detectable by antivirus or antispyware programs, many computer users are completely unaware that their personal computer has become infected. Many problems caused by bots go undetected until it is too late. Warning signs that your computer may have been accessed include slow computer speed and unusual or irrelevant error messages. However, these symptoms are often sporadic, and computer users take little notice. Many people will continue to use their personal computer, unaware that bots have hijacked it and are slowly at work, looking for credit card numbers, passwords, and logon information which can be used for identity theft and other types of online crime.

The program scans your personal computer on a continuous basis, closing the gaps that could allow bots to infect it and better ensuring that bots do not invade and gain control. Symantec Norton AntiBot can also determine whether a bot is harmful or useful, allowing you to continue using those bots you love and have come to depend on for information and services. It can be used alongside several other antivirus and antispyware programs; its compatibility is not limited to Norton products. The cost of the software is $29.95 for one year of service. 
It was awarded PC Magazine's Editor's Choice Award (2007) and underwent rigorous testing, which included using AntiBot on computers with existing threats as well as allowing threats to try to access the computer after installation. With the growing threat of identity theft and credit card fraud, Symantec Norton AntiBot offers an additional level of protection needed to combat bots and prevent them from turning one's personal computer into a robotic instrument of destruction to both your personal and financial well-being.

Record Your Products: Reap The Rewards of Recording And Getting Your Product Done Faster And Easier.

by: Patsy Bellah

Some of you will remember when we had to type on typewriters. Some of you, present company included, may even remember when we had to type on "standard" or manual typewriters. For those who aren't in the know, that's a typewriter without electricity. Then we got electric typewriters. That was something new to learn, but all our work could be done faster, easier and with less mess. Then came computers. There was more to learn, but with this technology life was made even easier for secretaries, writers, or anyone having to convey information with the written word.

With each of these advances there were those who said they couldn't do it. They didn't like it; they didn't like change. They could get along just fine, thank you very much, with a manual typewriter, or an electric one. They didn't need computers. There was too much to learn. It was too different. Don't let that attitude keep you from learning the latest time saver for transferring words to paper: the digital recorder.

As the manual typewriter gave way to more sophisticated electric typewriters, which gave way to the computer, so, too, has the digital recorder made it faster and easier to turn the spoken word into the written word. On average, a one-hour recording will yield about 20-30 typewritten pages. That means that with a one-hour "conversation" - speaking your story or information into a recording device, then getting it transcribed - you can transfer your spoken word to a document in about 25% of the time it would take you to type it yourself. It may take a bit of practice to learn to dictate into a recorder, but once you have, you will find that you can save yourself a ton of time. Statistics show that the longer it takes to complete a project, the less likely it is that you will finish it. Embrace this new technology.

Here are some guidelines you should consider when purchasing a digital recorder:

1. You must be able to download your recording to your computer. 
Some of the less expensive recorders are not “downloadable.” You need to be able to transfer your recording through the Internet in order to send it to a transcription service or even if you want to transcribe it yourself. 2. Although most recorders come with internal microphones, it is best to have the capability to attach an external microphone. External microphones work better to record presentations or to record from a distance. Additionally, you can elect to use a lavaliere microphone for yourself and not be hampered with holding the recorder. Or, if you are recording more than one person, such as if you are interviewing someone, you can get an attachment which allows you to hook up two microphones. 3. The recorder should have at least four hours of available recording time using the high quality recording setting. You want to make sure the recorder has enough time to record a full presentation before having to be downloaded to the computer. The capabilities of recorders change all the time, and in my recent research I found that the prices, like anything else, are coming down drastically and we are getting more and more recording time. I checked out the Olympus recorder on the Internet and found a very good quality recorder for around $100.00. I also found that you could buy this at Best Buy in the Los Angeles area at the same price. Other locations such as Samy’s Cameras for those in the Los Angeles area, Circuit City, Radio Shack and Frys may also have them. For those of you who live in the Los Angeles area, I found an Olympus and a Marantz at Samy’s Cameras which uses a flash card and can get you as much as 4G-8G of storage space. Both of these sell for just under $400.00. The Sony or the Edirol are also good recorders, and have similar capabilities and prices. Buying a recorder is much like buying a blender or a computer. Although it’s wise to buy as much as your pocketbook allows, at the same time, you don’t need to buy more than you will use. 
Why spend the extra money? A digital recorder is small and easy to use. On it you can record all of your information products, plus your presentations, blogs or articles. Embrace this new technology. Using a digital recorder to record your information products, presentations or teleseminars will allow you to finish your product in less than 25% of the time it would take you to type it yourself. If you get your audio transcribed, once you get it back all you have to do is edit it, and you can have your product completed in less than a week.

Internet And Business Online – The Act Of Interdependence

by: Scott Lindsay

The best role of business online is that of interdependency. We've all heard the old saying, "No man is an island." When it comes to online business this is especially true. If a business owner who takes their business into the online world determines to be self-reliant and never accept the help of anyone, that individual will not be in business long enough to change their mind.

It is an accepted fact that the greatest tool for long-term exposure for your website is Search Engine Optimization (SEO). Without it, potential customers can't find you. It is unreasonable to expect that you can adequately develop a website without optimizing it for the best possible search engine ranking. Search engines also place a high value on sites that have links placed on existing sites. These 'backlinks' demonstrate to search engines that others trust your site. By placing your link on their website, these other businesses indicate trust in and a recommendation of your site. In effect, the two strategies listed above rely exclusively on what others can do for you when it comes to your online business.

Shirley Temple once proclaimed in her movie Rebecca of Sunnybrook Farm, "I'm very self-reliant." American westerns are filled with lines about pulling yourself up by your bootstraps and holding down the fort. Many of us have grown up to believe that if we want something done right we have to do it ourselves. This thinking is in opposition to the rules of online business. The online world can only exist because people share. Individuals share technology, but they also share links, reviews, blogs, forums and a wide range of other marketing strategies in a commingling of interdependency. In online business you are as dependent on others as they may be on you. Unlike the word 'dependent', the term 'interdependent' indicates a mutual dependency. 
In other words, you are depending on others to help provide links back to your site while they are equally dependent on you (or others) for the success of their business. Have you really taken a proactive approach to networking? It's possible you are reading this today and have never considered asking someone else to place a link to your site on his or her online business site. It can feel awkward depending on others to achieve online success, especially if you've been led to believe that reliance on others is an imposition on their otherwise brilliant generosity. I suppose it could be a deep-seated sense of pride that makes it hard to consider asking others for help. However, the truth is that depending on others is really what has made the Internet possible. This online world is composed of a web of computers, networks and servers connected in a way that provides the maximum benefit for all.

Building an online business can feel a bit like trying to build a house of cards. Without the ability to rely on the other 'cards' around you it is virtually impossible to build. Interdependence: this is the essence of online business.

Web Development And The Big Time Out

by: Scott Lindsay

One of the great debilitators in online business is simply the perceived (or real) lack of time. Business owners are used to moving forward. An online web presence can make them feel tied to an office chair, learning skills they aren't sure they want to know.

It's not uncommon for those who work in full-time web design to have individuals contact them for a site design with absolutely no idea what they want. When the designer questions them, the response might be, "I don't know, just make it look nice." And let's not forget the core values or mission of the business: many business owners have no idea how to answer those kinds of questions. They may stare blankly for a moment or two, and since there's no more time for further deep thought, they go back to action - without answers.

In many cases it is possible to answer some of these questions, but it may require taking time away from a familiar setting. It may also require more time than you think you want to give. If you can get to a place of concentrated contemplation, you are likely to find yourself stripping ideas to their core to find out what your business is trying to accomplish and what your ultimate goals might be. As with almost any project, you can turn frustration around if you will just take the time to come to terms with your vision. Sometimes we spend so much time 'doing' that we never stop to ask the question, "Why?"

This process can be a bit like taking a bus that drives around the park. You keep looking at the flowers and the park bench and long to sit in the quiet shade of a tree and just absorb the calming atmosphere. You know they will have a positive effect on you, but for some reason you just can't seem to find the energy to get off the bus. It seems to me there are some sites that are misguided, or rarely guided, that could benefit from the process of self-evaluation. 
These sites may look nice, but there is a sense of disconnection that may not be easy to identify yet is fairly obvious to visitors. Creative energy is at a minimum while business owners simply tackle what seem to be the most urgent details.

As more people gravitate to online business there needs to be a shift in thinking about how one goes about doing business online. In many ways it can't be approached the same way a traditional business is developed, yet that is typically how many new web commerce ventures choose to tackle the subject. You may discover your business will be more successful if you take some time for rigorous reflection. The time set aside can be a bit like an architect taking the time to develop plans for a new building. You wouldn't expect the architect to simply tell a construction crew, "Go out there and build - something." Work at 'building' your online business in a comprehensive way. Your effort can develop a firm foundation for long-term success.

Back to Back User Agents for Telecommunications

by: Danny Loeb

Today’s telecommunications networks are a delicate blend of clients and servers that together offer virtually endless possibilities when it comes to services and applications. For every new client developed, there seem to be a score more on the way: mobile handsets, PDAs, terminals, telephones, video phones, IP set-top-boxes, and so on.

There are essentially two types of servers that connect clients on large networks: Proxy servers and Back-to-Back User Agent (B2BUA) servers. The more prevalent Proxy servers feature predictable behavior, simply connecting clients. B2BUA servers, by contrast, are much stronger and more intelligent entities that perform actions Proxy servers cannot. Moreover, B2BUA servers provide a flexible solution for a wide range of applications and services, and are becoming the primary engine for more and more SIP servers in NGN and IMS networks.

The difference between Proxy servers and B2BUA servers is sometimes not fully understood. In this article, we will explore what makes B2BUA servers such an appealing alternative to standard Proxy servers. A better understanding of B2BUA servers can help managers appreciate the value, and the tradeoffs, of choosing a B2BUA server, as well as the frameworks needed to develop a wide range of SIP applications and SIP services using it.

Figure 1 - Architectural difference between Proxy servers and B2BUA servers

B2BUA Server Defined

B2BUA servers are used to provide value added features for point-to-point calls and to manage multi-point calls. The power behind a B2BUA server derives mostly from the fact that it has a very generic definition, which gives it almost unlimited power. However, this same characteristic is the root of the controversy surrounding it. The IETF standard (RFC 3261) defines a back-to-back user agent as “a logical entity that receives a request and processes it as a user agent server (UAS). 
In order to determine how the request should be answered, it acts as a user agent client (UAC) and generates requests. Unlike a Proxy server, it maintains a dialogue state and must participate in all requests sent on the dialogues it has established.”

B2BUA servers have capabilities that far exceed those of other types of SIP servers, and answer the need for sophisticated value added SIP applications that cannot be implemented as Proxy applications. Some of these capabilities, unique to B2BUA servers, are outlined below.

3rd Party Call Control (3PCC) Features

3rd Party Call Control (3PCC) is the ability of an entity (usually a controller) to set up and manage communication between two or more parties. 3PCC is often used for operator services and conferencing. 3PCC actions are important capabilities, exclusive to B2BUA servers, since “passive” non-call-stateful elements such as Proxy servers cannot initiate these types of activities. Some examples of 3PCC services are online billing, QoS, resource prioritization, call transfer, click-to-dial, mid-call announcements and more. 3PCC actions can be initiated automatically by B2BUA server applications, for example disconnecting a call following credit expiration in an online-billing system, or they can be initiated by remote administrative control (OSS), e.g. inviting parties to a multi-point conferencing session.

Figure 2 - Schematic outline of a B2BUA server offering 3PCC functionality

Inter-working Function (IWF) for Interoperability

SIP was designed as a highly flexible and extendible protocol. The very strength of this flexibility is also an inherent weakness, since the vast array of client types in the market still need to connect. B2BUA Inter-working Functions (IWF) define a wide range of powerful SIP servers that connect SIP clients that “speak” different protocol dialects, or support different capabilities. 
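Because a B2BUA terminates the dialogue on one leg and re-originates it on the other, it can rewrite what it forwards. The toy sketch below models an inter-working relay in plain Python; the dict-based "message" and the choice of headers are deliberate simplifications for illustration, not a real SIP stack (Session-Expires and P-Charging-Vector are real SIP/IMS extension headers):

```python
# Toy sketch of a B2BUA inter-working function (IWF): the server receives a
# request as a UAS on one leg and re-generates it as a UAC on the other,
# forwarding only what the far-end client can understand.
def relay(message, supported_by_far_end):
    core = {"From", "To", "Call-ID", "CSeq"}   # always forwarded
    # A real IWF would also renegotiate timers, transport and SDP; here we
    # simply filter out extensions the destination does not support.
    return {name: value for name, value in message.items()
            if name in core or name in supported_by_far_end}

incoming = {"From": "sip:alice@example.com",
            "To": "sip:bob@example.com",
            "Call-ID": "abc123",
            "CSeq": "1 INVITE",
            "Session-Expires": "1800",             # session timer extension
            "P-Charging-Vector": "icid-value=42"}  # IMS-only extension

# Bob's client supports session timers but is not an IMS endpoint,
# so the IMS P-Header is stripped before the request leaves the second leg
print(relay(incoming, supported_by_far_end={"Session-Expires"}))
```

A Proxy server could not do this, because it must forward requests essentially unchanged and does not own both dialogue legs.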
The inter-working function is very important in enabling connectivity between clients with different capabilities and/or protocol dialects, or even between clients and networks, where the B2BUA server actually acts as an access device. Examples of what an IWF can do include:

• Connecting SIP clients to IMS networks by adding and removing IMS SIP protocol extensions (AKA P-Headers) that are essential for connecting to the IMS network
• Connecting clients with different Session Timer settings
• Connecting clients with different media capabilities and with distinct Session Description Protocol (SDP) messages by relaying between the two types of control sessions
• Connecting to different types of networks (e.g. IPv4, IPv6) and supporting different transport types, such as TCP/UDP/SCTP/TLS

Figure 3 - Schematic outline of a B2BUA Inter-Working Function

Multi-point Call Management

B2BUA servers can also implement multi-point call scenarios where multiple CPE devices connect to the B2BUA, and the B2BUA provides services to all CPE.

Due to these unique capabilities, B2BUA servers are widely used in the communications industry. A few examples are listed below:

• Online-billing/prepaid functions
• Servers supporting Resource Prioritization (RP) and/or Quality of Service (QoS) features
• Multi-point Conferencing servers
• IVR servers
• PBX Applications and Softswitches
• Application Layer Gateways (ALG)
• FW/NAT Traversal applications
• Privacy servers
• 3rd-Party Call Control (3PCC) Applications
• Service Creation Environment (SCE) runtime engines
• Session Border Controllers (SBC)
• IMS S-CSCF, P-CSCF, I-CSCF
• SIP Inter-working Function (IWF) Gateways
• Security Gateways (SEG)
• Voice Call Continuity (VCC) servers

In addition, B2BUA servers play an important role in emerging IMS networks. Recent releases of the 3GPP IMS specifications (3GPP TS 24.229 V8.0.0) indicate that an increasing number of IMS network element servers, such as the P-CSCF, IBCF, SBC etc., are B2BUA servers. 
The reason for this is that value-added services are usually session stateful, and feature capabilities that go beyond basic call proxying. Applications written on top of B2BUA application servers fulfill several roles, such as SIP user agents, SIP proxy servers and SIP registrars.

B2BUA Server Challenges

B2BUA application developers face many challenges, such as achieving rapid time-to-market, conformance and interoperability, customization for proprietary services, and support for High Availability (HA) and redundancy. A comprehensive B2BUA framework can help developers overcome these challenges.

A solid B2BUA framework should have a modular application building block architecture for increased flexibility, abstraction and short delivery time. Traditional architecture, which features a single configurable state machine, is not flexible enough. A B2BUA framework should instead facilitate developing B2BUA applications by flexibly linking "pluggable" high-level Modular Application Building Blocks (MABBs). Developers should have the ability to combine these MABBs, and the blocks should be designed in a way that allows developers to further customize their behavior if needed. This type of architecture complies with contemporary Service Oriented Architecture (SOA) concepts, and is suitable for powering flexible business communication platforms.

This modular architecture can save months of work. With a set of MABBs in hand, developing an application is a matter of combining existing MABBs to produce the required business logic. In addition, this architecture enhances efficiency, since development of new MABBs can be done concurrently.

A B2BUA framework should facilitate developing applications that fully conform to standards and are interoperable, without restricting developers from customizing protocol behavior for special cases. Moreover, it should accommodate non-standard implementations, as well as mediate between two versions of the same standard.
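The idea of composing "pluggable" building blocks into business logic can be sketched generically. The block interface below (each block receives the call and the next handler) is a hypothetical illustration of the MABB concept, not RADVISION's actual API.

```python
# Hedged sketch of modular application building blocks (MABBs) linked
# into a call-handling pipeline. The interface is hypothetical; the point
# is that business logic is assembled by chaining reusable parts.

def billing_block(call, nxt):
    if call.get("credit", 0) <= 0:
        return {"action": "reject", "reason": "no credit"}
    return nxt(call)

def announcement_block(call, nxt):
    call.setdefault("events", []).append("played greeting")
    return nxt(call)

def routing_block(call, nxt):
    return {"action": "route", "target": call["callee"],
            "events": call.get("events", [])}

def compose(*blocks):
    """Link blocks right-to-left so each one can invoke the next."""
    def terminal(call):
        return {"action": "drop"}
    handler = terminal
    for block in reversed(blocks):
        handler = (lambda b, nxt: lambda call: b(call, nxt))(block, handler)
    return handler

app = compose(billing_block, announcement_block, routing_block)
print(app({"callee": "sip:bob@example.com", "credit": 5}))
print(app({"callee": "sip:bob@example.com", "credit": 0}))
```

Swapping `billing_block` for a different policy, or inserting a new block mid-chain, changes the application without touching the other blocks, which is the time-to-market argument the article makes for this architecture.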
This type of framework allows developers to focus on their proprietary application with the confidence that the final application will be fully interoperable. And finally, a B2BUA framework should provide the ability to configure, amend and replace application building blocks to create proprietary features. With this ability, developers can maximize existing code, significantly reducing development time, shortening testing cycles, and reducing overall time-to-market.

Figure 4 - Traditional architecture of a B2BUA framework

RADVISION's B2BUA Application Framework (http://www.radvision.com/Products/Developer/SIPServer) delivers these capabilities and more. The B2BUA Application Framework module is part of the RADVISION SIP Server Platform, a software framework that offers the essential building blocks for the development of a wide variety of high-performance SIP and IMS servers. The rich set of components and modules can be flexibly combined to match customers' requirements for developing SIP servers that offer both standard and advanced SIP services. Applications written on top of RADVISION's B2BUA framework are developed by combining customizable modular application building blocks. These are effectively large chunks of functionality that can be strung together to form ad-hoc applications, enabling developers to focus on the high-level business logic and use building blocks that hide low-level details.

As one of the most popular IM applications, Yahoo! Messenger was the first large consumer player to adopt a B2BUA. Yahoo! Messenger combined its scalable backend platform with RADVISION's B2BUA to serve millions of monthly unique messaging users around the world. Yahoo selected RADVISION's B2BUA for its robust performance and scalability features.

Figure 5 - The architecture of the RADVISION B2BUA Application Framework

RADVISION also offers automatic High Availability (HA) and redundancy support.
The B2BUA framework automatically replicates the run-time state of the different services and of the B2BUA framework core. In the event of a server outage, a redundant server takes over seamlessly and provides uninterrupted service continuity.

B2BUA framework benefits in a nutshell:

• Significantly reduces time-to-market for developing proprietary B2BUA applications and services
• Allows advanced services to be added easily, to retain competitive advantage and evolve to meet growing customer demands
• Lets developers focus on the business logic while hiding low-level communication intricacies
• Delivers off-the-shelf conformance and interoperability
• Enables rapid development of applications that can interoperate with different vendors
• Enables high-availability features to be added easily

Click here (http://www.radvision.com/Resources/WhitePapers/b2bua.htm) for more extensive information on B2BUA servers.

by: Danny Loeb, Product Manager, RADVISION (http://www.radvision.com)

The Battle of the Browsers – The History and the Future of Internet Browsers

by: Nicholas C Smith

With Internet Explorer 8 now available, can Microsoft hope to retain market dominance over fierce open source rivals such as Mozilla's Firefox or the feature-packed Opera web browser? Can history give us a clue to what the future of web browsers and browsing might hold? How did Netscape Navigator go from a dominant 89.36% share of the web browser market in 1996 to only 3.76% by mid-1999? Let us take a journey that begins long before even the intellectual conception of Internet Explorer, glances at its long-defeated rivals, examines the current browsers available, and ends with a prediction of what the future of browsing will offer us, and which browser(s) will still be around to offer it.

People often think that Internet Explorer has been the dominant web browser since the golden age of the internet began. For a very long time now it has indeed been the most popular browser, and at times it has been almost totally unrivalled. This was mainly a result of it being packaged free with Microsoft Windows, in what some would later call a brutal monopolisation attempt by Microsoft. The last few years, however, have heralded the arrival of new, possibly superior browsers. Mozilla's Firefox has been particularly successful at chipping away at Explorer's market dominance. So where did it all begin, and why were Microsoft ever allowed to have a hundred percent market dominance?

Origins

The truth is they never did have total dominance, but at times they came very close. Microsoft actually entered the browser battle quite late. In fact, a man named Neil Larson is credited as one of the originators of internet browsers: in 1977 he created a program for the TRS-80 that allowed browsing between "sites" via hypertext jumps. This was a DOS program and the basis of much to come. Slowly, other DOS-based browsers inspired by it were developed.
Unfortunately they were often constricted by the limitations of the still fairly young internet itself. In 1988, Peter Scott and Earle Fogel created a simple, fast browser called Hytelnet, which by 1990 offered users instant logon and access to the online catalogues of over five thousand libraries around the world, an exhilarating taste of what the internet, and web browsers, would soon be able to offer.

In 1989 the original World Wide Web was born. Using a NeXTcube computer, Tim Berners-Lee created a web browser that would change how people used the internet forever. He called his browser WorldWideWeb, a name still likely to sound familiar to internet users today. It was a windowed browser capable of displaying simple style sheets, editing sites, and downloading and opening any file type supported by the NeXTcube.

In 1993 the first popular graphical browser was released. Its name was Mosaic, and it was created by Marc Andreessen and Eric Bina. Mosaic could be run on Unix and, very importantly, on the highly popular Microsoft Windows operating system (incidentally, it could also be used on Amiga and Apple computers). It was the first browser on Windows that could display graphics/pictures on a page that also contained textual content. It is often cited as being responsible for triggering the internet boom, as it made the internet bearable for the masses. (It should be noted that the web browser Cello was the first browser used on Windows, but it was non-graphical and made very little impact compared to Mosaic.)

The Browser Wars: Netscape Navigator versus Internet Explorer

Mosaic's decline began almost as soon as Netscape Navigator was released (1994). Netscape Navigator was a browser created by Marc Andreessen, one of the men behind Mosaic and co-founder of Netscape Communications Corporation. Netscape was unrivalled in terms of features and usability at the time.
For example, one major change from previous browsers was that it allowed surfers to see parts of a website before the whole site had downloaded. This meant that people did not have to wait for minutes simply to see whether the site they were loading was actually the one they were after, and it also allowed them to read information on the site as the rest of it downloaded. By 1996 Netscape had almost 90% market dominance, as shown below.

Market share comparison of Netscape Navigator and Internet Explorer, 1996 to 1998:

                  Netscape    IE
October 1998      64%         32.2%
April 1998        70%         22.7%
October 1997      59.67%      15.13%
April 1997        81.13%      12.13%
October 1996      80.45%      12.18%
April 1996        89.36%       3.76%

In these two years Netscape clearly dominated the internet browser market, but a new browser named Internet Explorer was quickly gaining ground on it. Microsoft, clearly worried about Netscape's dominance, released their own browser (ironically based on the earlier Mosaic browser, which had been created by one of the men now running Netscape). It was not so much the worry that Netscape would have a 100% share of browsers on the Windows operating system, but rather the worry that browsers would soon be capable of running all types of programs. That would mean foregoing the need for an actual operating system, or at most needing only a very basic one. This in turn would mean Netscape could soon dictate terms to Microsoft, and Microsoft were not going to let that happen easily. Thus in August 1995, Internet Explorer was released. By 1999 Internet Explorer had captured an 89.03% market share, whilst Netscape was down to 10.47%. How could Internet Explorer make up this much ground in just two years? It came down to two things, really.
The first, and by far the most important, was that Microsoft bundled Internet Explorer with every new copy of Windows, and as Windows was used by about 90% of the computer-using population, this clearly gave them a huge advantage. Internet Explorer had one other ace to hold over Netscape: it was much better. Netscape Navigator was stagnant and had been for some time. The only new features it ever seemed to introduce were often perceived by the public as beneficial for Netscape's parent company rather than for Netscape's user base (i.e., features that would help it monopolise the market). Explorer, on the other hand, was given much attention by Microsoft. Regular updates, excellent usability and a hundred-million-dollar investment would prove too much for Netscape Navigator.

2000 – 2005

These years were fairly quiet in the battle of the browsers. It seemed as if Internet Explorer had won the war and nobody could even hope to compete with it. In 2002/2003 it had attained about 95% of the market share, around the time of IE 5 and 6. With over 1,000 people working on it and millions of dollars being poured in, few had the resources to compete. Then again, who wanted to compete? It was clearly a volatile market, and besides, everybody was content with Internet Explorer. Or were they? Some people saw faults with IE: security issues, incompatibility issues, or simply bad programming. Not only that, it was being shoved down people's throats. There was almost no competition to keep it in line or to turn to as an alternative. Something had to change. The only people with the ability and the power to compete with Microsoft took matters into their own hands. Netscape was now supported by AOL. A few years prior, just after losing the Browser Wars to Microsoft, they had released the Netscape source code to the public. This meant anybody could develop their own browser using the Netscape skeleton. And people did.
Epiphany, Galeon and Camino, amongst others, were born out of Netscape's ashes. However, the two most popular newcomers were called Mozilla and Firefox. Mozilla was originally an open source project aimed at improving the Netscape browser. Eventually it was released as Netscape Navigator 7 and then 8, and later as Mozilla 1.0. Mozilla was effectively an early version of another open source browser, Firefox. Being open source, the public were able to contribute to it, adding the features it needed, the programming it required and the support it deserved. The problems people saw in Internet Explorer were being fixed by members of the open source community via Firefox. For instance, the many security issues IE 6 had were almost entirely fixed in the very first release of Firefox. Microsoft had another fight on their hands.

2005 – Present

Firefox was the browser that grew and grew in these years, every year capturing an even larger market share percentage than before. Being more user friendly than most of its rivals, along with high security levels and arguably more intelligent programming, helped its popularity. With such a large programming community behind it, updates have always been regular and add-on programs/features are often released. It prides itself on being the people's browser. It currently has a 28.38% market share.

Apple computers have had their own browser, Safari, complete with its own problems, such as (until recently) the inability to run Java applets. However, most Apple users seem happy with it, and a version capable of running on Windows has been released. It has had no major competitor on Apple Macs, and as such has largely been out of the Browser Wars. It currently holds a 2.54% market share and is slowly increasing. Internet Explorer's market share has dropped from over 90% to around 75%, and is falling. It will be interesting to see what Microsoft will attempt in order to regain such a high market share.
Opera currently holds 1.07%. Mozilla itself only has a 0.6% market share these days.

The Future of Web Browsing

Web browsers come and go. It is the nature of technology (if such a term can be used) to supplant inferior software in very short periods of time. It is almost impossible for a single company to stay ahead of the competition for long. Microsoft have the advantage of being able to ship IE with every Windows PC, which covers over 90% of the market. They also have the advantage of unprecedented resources. They can compete how they wish for as long as they wish, so there is no counting IE out of the future of web browsing. Safari is in a similar position, being easily the most popular Mac web browser; its long-term survival is dependent upon Apple and the sale of their computers. These are the only two browsers that are almost guaranteed another five years of life, at least. Firefox may seem like another candidate, but the public is fickle, and one bad release, or a sustained lag behind the new Internet Explorer 8, could easily see its popularity descend into virtual oblivion. However, it seems likely that community-driven browsers, such as Mozilla and Firefox, will be the only types of browser capable of competing with the wealthy internet arm of Microsoft in the near future.

As for web browsing itself, will it change any time soon? Well, it already has for some online communities. For example, if you want to buy clothes you could enter an online 'world', creating a virtual you to go from 'shop to shop' with, looking at products and trying or buying what you see. Some 'worlds' allow you to recreate yourself accurately, including weight and height, and then try on apparel such as jeans to give you an idea of how you would look in a particular item. Will 'worlds' like this destroy normal web browsers such as IE? It seems unlikely.
Traditional web browsers provide such freedom and ease of access that it is hard to see any other alternative taking over. However, these 'worlds' are part of a new, out-of-the-box wave of alternatives that some people will find attractive, and, really, who knows what the future will bring?

Can Data Breaches Be Expected From Bankrupt Mortgage Lenders?

by: Tim Maliyil

The stock market is in a tumult. Actually, it has been for about a year, ever since the subprime fiasco (anyone take a look at Moody's performance over the past year?). Now that that particular issue has been beaten to death, other mortgage-related issues are cropping up. Most of the coverage in the media is financial in nature, but some of those mortgage-related issues do concern information security.

It's no secret that there are plenty of companies in the US that discard sensitive documents by dumping them unceremoniously: leaving them by the curb, driving them to a dumpster, heaving them over the walls of abandoned property, and other assorted mind-bogglingly insecure practices. In fact, MSNBC has an article on this issue, naming numerous bankrupt mortgage companies whose borrowers' records were found in dumpsters and recycling centers. The information on those documents includes credit card numbers and SSNs, as well as addresses, names, and other information needed to secure a mortgage. Since the companies have filed for bankruptcy and are no more, the potential victims involved have no legal recourse, and are left to fend for themselves.

In a way, it makes sense that companies that have filed for bankruptcy behave this way. (Not that I'm saying this is proper procedure.) For starters, if a company does wrong, one goes after the company; but if the company has filed for bankruptcy, it is no more, so there's no one to "go after." In light of the company's status, the actual person remaining behind to dispose of things, be they desks or credit applications, can opt to do whatever he feels like. He could shred the applications. He could dump them nearby. He could walk away and let the building's owner take care of them. What does he care? It's not as if he's gonna get fired. Also, proper disposal requires time, money, or both. A bankrupt company doesn't have money.
It may have time, assuming people are going to stick around, but chances are their shredder has been seized by creditors, and people are not going to stick around to tear things up by hand, literally.

Aren't there any laws regulating this? Apparently, such issues are covered by FACTA, the Fair and Accurate Credit Transactions Act, and although its guidelines require businesses "to dispose of sensitive financial documents in a way that protects against 'unauthorized access to or use of the information'" [msnbc.com], it stops short of requiring the physical destruction of data. I'm not a lawyer, but perhaps there's enough leeway in the language for one to go around dropping sensitive documents in dumpsters? Like I mentioned before, inappropriate disposal of sensitive documents has been going on forever; I'm pretty sure this has been a problem since the very first mortgage was issued. My personal belief is that most companies would act responsibly and try to properly dispose of such information. But this may prove to be a point of concern as well, because of widespread misconceptions of what it means to protect data against unauthorized access.

What happens if a company that files for bankruptcy decides to sell its computers to pay off creditors? Most people would delete the information found on the computer, and that's that, end of story. Except it's not. When files are deleted, the actual data still resides on the hard disk; it's just that the computer's operating system no longer has a way to find the information. Indeed, this is how retail data restoration applications such as Norton's are able to recover accidentally deleted files. Some may be aware of this and decide to format the entire drive before sending the computer off to its new owners. The problem with this approach is the same as with deleting files: data recovery is a cinch with the right software. Some of these tools retail for $30 or less, and some are free.
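The gap between "deleted" and "destroyed" is why overwriting matters. Here is a minimal best-effort sketch of wiping a file's contents before removing it, with an important hedge: on journaling filesystems, SSDs with wear leveling, and copy-on-write storage, an in-place overwrite is not guaranteed to destroy every copy of the data; the sketch only illustrates why a plain delete is weaker still. The filename is made up for the example.

```python
# Best-effort "wipe before delete": overwrite contents, then remove.
# Caveat: not reliable on SSDs / journaling / copy-on-write storage.

import os

def wipe_and_delete(path, passes=1):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # overwrite with random bytes
            f.flush()
            os.fsync(f.fileno())        # push the overwrite to the device
    os.remove(path)

# Hypothetical sensitive document:
with open("loan_application.txt", "w") as f:
    f.write("SSN: 000-00-0000")
wipe_and_delete("loan_application.txt")
print(os.path.exists("loan_application.txt"))  # False
```

A plain `os.remove()` only drops the filesystem's pointer to the data, which is exactly what retail recovery tools exploit; the overwrite pass at least replaces the sector contents the pointer used to reference.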
So, the sensitive data that's supposed to be deleted can be recovered, if not easily, at least cheaply, perhaps by people with criminal interests. Am I being paranoid? I don't think so. I've been tracking fraud for years now, and I can't help but conclude that the criminal underworld has plenty of people looking to be niche operators, not to mention that there are countless ways of defrauding people (look up "salad oil" and "American Express" for an example). An identity theft ring looking to collect sensitive information from bankrupt mortgage dealers wouldn't surprise me, especially in an environment where such companies are dropping left and right. The economics make sense as well. A used computer will retail anywhere from $100 to $500. The information on it, if not wiped correctly, will fetch many times more, even if you factor in the purchase of data recovery software. Criminals have different ways of capitalizing on personal data, ranging from selling the information outright to engaging in something with better returns.

Is there a better way to protect oneself? Whole disk encryption is a way to ensure that such problems do not occur: one can just reformat the encrypted drive itself to install a new OS; the original data remains encrypted, so there's no way to extract it. Plus, there's the added benefit that the data is protected in the event that a computer gets lost or stolen. However, common sense dictates that encryption is something ongoing concerns sign up for, not businesses about to go bankrupt. My guess is that sooner or later we'll find instances of data breaches traced back to equipment from bankrupt mortgage dealers.
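Why does whole disk encryption make a simple reformat safe? Because anything a recovery tool scrapes off the platters is ciphertext, useless without the key. The toy below illustrates only that property; it is emphatically not real disk encryption (real products use ciphers such as XTS-AES), and the keystream construction here is purely for demonstration.

```python
# Toy illustration (NOT real disk encryption) of why an encrypted drive
# can simply be reformatted: recovered sectors are ciphertext without
# the key. Real whole-disk products use XTS-AES or similar.

import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Deterministic byte stream derived from the key (demo only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR with the keystream; applying it twice recovers the data."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

sector = b"SSN: 000-00-0000; card 4111..."
ciphertext = xor_crypt(sector, b"disk-key")
print(ciphertext != sector)                          # True: attacker sees noise
print(xor_crypt(ciphertext, b"disk-key") == sector)  # True: key holder decrypts
```

The asymmetry is the whole point: the buyer of the used machine, armed with $30 recovery software but no key, recovers only the first kind of output.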