10 things IT needs to know about Ajax

The introduction of any new Web technology will affect a network's infrastructure in ways that range from inconsequential to earth shattering. Ajax is one of the more disruptive new Web technologies traveling across networks today. To help you minimize future surprises on your network, we've outlined the 10 things you should take to heart about Ajax.


See a slide show of 10 tips for deploying Ajax applications.


1) Ajax is an idea, not an acronym

While Ajax commonly is spelled out as Asynchronous JavaScript and XML, the full name is not entirely appropriate because it oversimplifies the history of the technology and the implementation options that lie at its heart. More exactly, Ajax encompasses the idea that Web applications can be built to opt out of the typical post-wait-repeat cycle used in server-side-focused Web applications. Ajax lets Web applications move to a more responsive, continuous, but incremental style of updating. Ajax provides users a richer, more interactive way of experiencing the underlying Web application. This goodness for the user may mean that more monitoring and security oversight is required of network professionals, as well as, potentially, server and network alterations.

2) It's really all about JavaScript

Ajax applications are written in JavaScript and usually rely on the XMLHttpRequest object for communications, which is making its way through the World Wide Web Consortium process. Because, like many Web technologies, it now is only an ad hoc industry standard, notable differences can be found in various browsers' implementations of it. It's also possible to use other data transport mechanisms — with and without widespread industry support — with Ajax applications, including traditional frame and image-cookie methods, as well as the use of binary bridges to Flash or Java.
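As a rough illustration of those implementation differences, a cross-browser factory function along the following lines is still common in Ajax code; this is a minimal sketch, not something taken from the article:

// Minimal sketch of a cross-browser XMLHttpRequest factory.
// Older Internet Explorer versions expose the object only through ActiveX.
function createXHR() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();                     // Standards browsers
    }
    if (window.ActiveXObject) {
        try {
            return new ActiveXObject("Msxml2.XMLHTTP");   // Newer MSXML
        } catch (e) {
            return new ActiveXObject("Microsoft.XMLHTTP"); // Legacy fallback
        }
    }
    throw new Error("XMLHttpRequest is not supported in this browser");
}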

Regardless of the transport approach used by the developer, Ajax has raised JavaScript to a more important position within a Web application than it previously held. JavaScript now is responsible for important data-collection, communication and consumption duties, so it no longer can be treated as a second-class Web technology without serious repercussions.

Developers who think the JavaScript technology is toxic can try to avoid the language by having a tool or framework generate it from some other language like Java (Google Web Toolkit, for example), or hide the code behind components or tags (such as with .Net or Ruby). At the end of the day, however, JavaScript still will be in the application. It's better to understand the language and embrace it directly, because if you are going to use Ajax, you ultimately are using lots of JavaScript.

Ajax is intertwined with the network, so bad code is going to mean lots of troubleshooting by network administrators as well as developers: The bottom line is to encourage good, network-aware coding! The same organizational "rules and tools" (coding standards, testing regimes and source-code control) that are in place for other languages must be applied to JavaScript to ensure that Ajax applications are supportable and robust.

3) XML is not required

Despite the "x" in the acronym, Ajax does not require XML. The XMLHttpRequest object can transport any arbitrary text format. For many Ajax developers, JavaScript Object Notation or even raw JavaScript code fragments make more sense as a data format, given that JavaScript is the consuming environment. For direct input into documents, other developers may favor raw text or HTML fragments. Still others will use such data formats as the less-known YAML markup language or such old standbys as comma-separated values.

Of course, it is possible and certainly reasonable to use XML, but it is far from required. Using binary formats for uploading files is not supported yet by the XMLHttpRequest object, but considering that Flash uses a binary format called Action Message Format, it is likely that similar features will be found in Ajax applications soon enough. You should know which format is being passed around the network, because it isn't always XML. Also, make sure you can analyze the format for performance and security.

4) Plan for an increase in HTTP requests

The most obvious issue for the network administrator supporting Ajax applications is that the architectural programming pattern has changed the network utilization of Web applications from a batch-like, somewhat infrequent response of a few hundred kilobytes, to a more continuous exchange of smaller HTTP responses. This means that network-bound Web and application servers may find themselves even busier than before. What Ajax will do to your server and network utilization certainly will depend on how the application is built — make sure your developers understand the network impact of their applications.

5) Optimize Ajax requests carefully

Web applications should adhere to the network delivery principle of sending less data, less often. That doesn't mean that this principle is widely followed by developers, however. Fortunately for the network, HTTP compression of Ajax responses can reduce response size and is supported in all modern browsers. Because of dynamic compression's overhead, however, speed may not improve much if responses are indeed relatively small. This means that it would be wise for network administrators to turn on compression on their Web server, but they need to understand that with Ajax applications, their gains won't be as big as with traditional Web applications.

To send data less often, we generally would employ caching. Most Ajax implementations can be openly hostile to caching, however, given certain assumptions made by browsers regarding not re-fetching URLs during the same session. Rather than work with caching, many Ajax developers will work aggressively to defeat caching via header settings or URL uniqueness.
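Those cache-defeating techniques typically look something like the sketch below; the URL and parameter name are made up, while Cache-Control is a real HTTP header:

// Two common ways Ajax code defeats caching (sketch only).
var xhr = new XMLHttpRequest();   // or the cross-browser factory sketched earlier

// 1) URL uniqueness: append a throwaway query parameter so the browser
//    never treats the request as a repeat of a cached one.
var url = "/data/feed?ts=" + new Date().getTime();
xhr.open("GET", url, true);

// 2) Header setting: ask the browser and intermediaries not to serve a cached copy.
xhr.setRequestHeader("Cache-Control", "no-cache");
xhr.send(null);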

It is possible to address caching concerns with a client-side Ajax cache written in JavaScript, but most Ajax libraries do not implement such a feature. Network professionals should show developers the benefit of caching, because Ajax probably will benefit more from that than from compression.
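A client-side cache of the kind described here can be as simple as a lookup table keyed by URL; this is a minimal sketch, not a feature of any particular library:

// Minimal client-side Ajax cache keyed by URL.
var responseCache = {};

function cachedGet(url, callback) {
    if (responseCache.hasOwnProperty(url)) {
        callback(responseCache[url]);            // Serve from memory, no network hit
        return;
    }
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            responseCache[url] = xhr.responseText;   // Remember for next time
            callback(xhr.responseText);
        }
    };
    xhr.send(null);
}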

6) Acknowledge the two-connection limit

Ajax applications are limited in practice to two simultaneous connections to the same host. This comes from the HTTP/1.1 specification's guidance to clients, not some browser bug or limitation. The good news is that it keeps many Ajax developers from swamping a server accidentally, though Microsoft's Internet Explorer 8 is supposed to go well beyond the limit. Chatty Ajax applications can be trouble, and with browsers changing the rules, network administrators need to keep a close eye on the number of requests made, and work with application developers to avoid employing such design patterns as fast polling or long-held connections.
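One way developers can cooperate with that limit rather than fight it is to funnel requests through a small queue; the sketch below assumes a simple GET-only case and is only illustrative:

// Simple request queue that keeps at most two XHRs in flight.
var MAX_CONNECTIONS = 2;
var pending = [];
var active = 0;

function queueRequest(url, callback) {
    pending.push({ url: url, callback: callback });
    pump();
}

function pump() {
    while (active < MAX_CONNECTIONS && pending.length > 0) {
        send(pending.shift());
    }
}

function send(job) {
    var xhr = new XMLHttpRequest();
    active++;
    xhr.open("GET", job.url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            active--;
            job.callback(xhr);
            pump();                              // Start the next queued request
        }
    };
    xhr.send(null);
}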

7) Watch out for response ordering

With traditional Web applications, the network effects of TCP/IP communications — such as the lack of order in which individual HTTP responses are received — generally are not noticed by developers or users. The base unit, the HTML document, is received before other objects, and it is what triggers the requests for them. Any subsequent request brings a whole new base document, thereby guaranteeing order. Ajax takes such implicit ordering away, however, so an application that depends on proper sequencing requires a response queue. Ajax frameworks are not consistent in acknowledging this network concern, so, again, make sure Ajax application developers understand such network-level concerns.
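A response queue of the kind described can be as simple as tagging each request with a sequence number and only applying results in order; this is a hedged sketch, not the approach of any particular framework:

// Apply responses strictly in the order their requests were issued,
// even if the network delivers them out of order.
var issued = 0;        // sequence number handed to each request
var applied = 0;       // next sequence number allowed to update the page
var buffered = {};     // out-of-order results waiting their turn

function orderedGet(url, apply) {
    var seq = issued++;
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            buffered[seq] = { text: xhr.responseText, apply: apply };
            flush();
        }
    };
    xhr.send(null);
}

function flush() {
    while (buffered.hasOwnProperty(applied)) {
        var entry = buffered[applied];
        delete buffered[applied];
        entry.apply(entry.text);
        applied++;
    }
}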

8) Acknowledge the effects of eliminating "Layer 8" error correction

For years, users have been correcting Web-delivery quality by reloading pages or pressing the Back button. Simply put, users doing this help mitigate network problems because errors occur generally at expected moments between page paints. With Ajax, however, application failure is no longer that obvious. Worse yet, users often are misinformed about errors, because the simple, animated-GIF spinning circle provides little information about the true status of the request.

Developers are at a loss because many libraries aren't effective at acknowledging that timeouts happen, retries must occur, and server and data errors crop up. JavaScript diagnostics showing communication and code errors are rarely in place on the client side, so blissful ignorance is the norm. More application-level monitoring is required for administrators to support Ajax properly.
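The kind of timeout-and-retry handling the author says is often missing might look like the sketch below; the eight-second timeout and retry count are arbitrary examples, not recommendations:

// GET with an explicit timeout and a bounded number of retries.
function robustGet(url, onSuccess, onFailure, retriesLeft) {
    var xhr = new XMLHttpRequest();
    var timedOut = false;

    var timer = setTimeout(function () {
        timedOut = true;
        xhr.abort();                             // Give up on this attempt
        retryOrFail();
    }, 8000);                                    // 8-second timeout, arbitrary

    xhr.open("GET", url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState !== 4 || timedOut) { return; }
        clearTimeout(timer);
        if (xhr.status === 200) {
            onSuccess(xhr.responseText);
        } else {
            retryOrFail();                       // Server or data error
        }
    };
    xhr.send(null);

    function retryOrFail() {
        if (retriesLeft > 0) {
            robustGet(url, onSuccess, onFailure, retriesLeft - 1);
        } else {
            onFailure(url);                      // Surface the failure to the UI
        }
    }
}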

9) Old security threats get a second exposure

If you listen to the pundits, Ajax may appear to produce more attack surface, but it really isn't any less secure than traditional Web-application development environments, because the HTTP inputs to the trusted server side are the same — headers, query string and message body. If implicitly trusting client-side code and entered data is not verboten already in your Web development group, however, Ajax may push things in that direction.

Cross-site scripting (XSS) isn't a vulnerability new with Ajax; it is just more common, especially if an application allows state data to be manipulated with JavaScript. HTML input should be disallowed in most cases, and the HttpOnly cookie flag should be applied immediately to reduce cookie hijacking and other attacks via XSS.
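On the client side, the minimum defense is to treat any untrusted data destined for the page as text rather than markup. A small helper along these lines is a common sketch; it does not replace server-side validation, and the element id used here is hypothetical:

// Escape untrusted text before building HTML with it, so any embedded
// markup is displayed rather than executed.
function escapeHTML(text) {
    return String(text)
        .replace(/&/g, "&amp;")
        .replace(/</g, "&lt;")
        .replace(/>/g, "&gt;")
        .replace(/"/g, "&quot;");
}

// Safer still: assign untrusted data as text, not as HTML.
function showMessage(untrusted) {
    var el = document.getElementById("status");   // hypothetical element
    if ("textContent" in el) {
        el.textContent = untrusted;               // Standards browsers
    } else {
        el.innerText = untrusted;                 // Older Internet Explorer
    }
}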

Cross-site request forgery likewise isn't new with Ajax, but if your application developers aren't checking the HTTP Referer (sic) header and managing sessions properly within Ajax applications, you've already been open to it, although it might be worse now.

Hackers, like developers, now are more interested in using and abusing JavaScript, which increases the potential for exploits. Network professionals should make sure developers are aware that client-side code can be manipulated even with obfuscation in place, so data inputs should always be filtered and sanitized, Ajax or not.

10) Abide by same origin for your protection

On the positive side of security, JavaScript's same-origin policy (SOP) is fully enforced in an XMLHttpRequest-based Ajax application. This policy makes sure that scripts cannot talk to domains outside of those from which they are issued. From the developer's point of view, this can be quite annoying because it means that pages served, for example, from ajaxref.com can't talk to a URL hosted on www.ajaxref.com; even if it is the same machine, it isn't the same exact domain. DNS equivalency doesn't matter here; it is a string-check employed by the SOP.

The SOP will severely hamper a developer's ability to perform some Web-service efforts on the client side as well. Clearly the best approach is to use a proxy on the server to bounce requests to other servers and combine the results. However, many Ajax developers attempt to break the same-origin restrictions. Using the <script> tag as a transport instead of the XMLHttpRequest object introduces dangerous trust assumptions, and that is the origin of much of the concern about overall Ajax security.
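The script-tag transport mentioned above works roughly as sketched below; the URL and callback name are hypothetical. Whatever the remote server returns runs with full access to your page, which is exactly the trust problem the author describes:

// Script-tag (JSON-with-padding style) transport: bypasses the
// same-origin policy by executing a remote script in your page.
function scriptTagRequest(url, callbackName, handler) {
    window[callbackName] = function (data) {
        handler(data);
        window[callbackName] = undefined;        // One-shot cleanup
    };
    var script = document.createElement("script");
    // The remote server is trusted to wrap its data in callbackName(...).
    script.src = url + "?callback=" + encodeURIComponent(callbackName);
    document.getElementsByTagName("head")[0].appendChild(script);
}

// Hypothetical usage: any code example.com returns will run in this page.
scriptTagRequest("http://example.com/feed", "handleFeed", function (data) {
    // Treat data as untrusted input.
});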

Now, with emerging browsers such as Firefox 3 and Internet Explorer 8 employing native cross-domain request facilities, there is certain to be more trouble on the horizon. As is the case with Java's security-sandbox concept, SOP restrictions exist precisely to keep developers from undermining security. Go around such safeguards with extreme caution.

Watch what you wish for

With Ajax, rich-application widgets will win a project, but bad plumbing may sink it. If the promise of a rich Ajax application is delivered in a network environment that is occasionally fragile, users will become disillusioned with the perceived instability of the application regardless of its slick interface. To enable desktop-like quality, network professionals must educate Ajax developers about certain network and security fundamentals and provide a solid and constantly monitored delivery platform that includes client-side diagnostics on JavaScript functioning and network performance from the user perspective. Users regularly see rich Web applications done right — like those coming from Google, for example — so anything less is a risky endeavor.

Powell is the founder of PINT, a San Diego Web development and consulting firm and the author of the recently published Ajax: The Complete Reference (ajaxref.com). He is also a member of the Network World Lab Alliance and can be reached at tpowell@pint.com.

source : networkworld.com


First Beta of Gentoo 2008.0 Released

Delayed one month from its original release schedule, Gentoo Linux 2008.0 Beta 1 made its appearance in the Linux world last night, April 1... and NO, it's not a joke. "Since this is the first beta, we're looking only for bugs in functionality, not bugs in appearance such as desktop backgrounds or other artwork. We expect to release a second beta once your testing has helped us fix problems with this first beta. [...] A migration to RPM was carefully considered again for this release, but in the end we decided to wait for the few remaining RPM-using distributions to migrate to the superior packaging format of ebuilds. [...] Your support and enthusiasm are greatly appreciated—thank you," says Donnie Berkholz in the release announcement.

Powered by the lightweight Xfce 4.4.2 desktop environment and the powerful Portage software management system, the Gentoo 2008.0 Beta 1 Live CD comes packed with a lot of new software and improvements over previous versions.


Below is the Gentoo 2008.0 Release Schedule:
April 3 2008 - Beta 1 release
April 14 2008 - Final upload
April 17 2008 - Gentoo 2008.0 release

And, as you can see from the above release schedule, the final release will be available to the general public in about two weeks. A second beta release is also expected, as Donnie Berkholz said in the release announcement.

Remember that this is a beta release and it should NOT be installed on production machines. It is intended for testing purposes only. Be sure to report any problems you find to the Gentoo bug tracker.
What is Gentoo Linux?

Gentoo is a source-based distribution. While there are various "stages," which include different levels of already compiled binary packages, the main idea is building every package from source code. There are also GRP packages, which reduce the need to compile certain packages that take a very long time to build.

The creation of this distribution was inspired by the ports system of FreeBSD; there is even an experimental stage with a FreeBSD kernel and Gentoo's Portage, called Gentoo/FreeBSD.

Download and test Gentoo Linux 2008.0 right now from Softpedia.

source : news.softpedia.com


10 stories that could be April Fools... but aren't

It's here again, the day when jokers set out to make fools of the rest of us. But not every bizarre story is a hoax. Here is a round-up of some of the day's seemingly spoof news stories which are actually true (and one that isn't).

1. A new pay-per-view funeral service scheme is being launched today. The Daily Mail says the scheme at Southampton Crematorium allows mourners to grieve from home by watching proceedings online.

2. A turtle is addicted to nicotine. He became addicted after picking up the smouldering butts in his owner's garden, in Kouqian, China, and sulks if he doesn't get his fix. The Daily Express, which picked up the story from Chinese news agency Xinhua, includes a gob-smacking picture of the turtle doing a rather good impression of Dot Cotton.

3. The menopause is caused by the age-old battle between wives and mothers-in-law, reports the Times. As long as 50,000 to 300,000 years ago, competition for food in a family unit was a battle won by the younger women who fed their offspring, which led to the older women losing their ability to breed. With food hard to find, mothers-in-law tended to help rear the grandchildren rather than have more children themselves.

4. An injection that allows women bigger and better orgasms by increasing the size of the mysterious G-spot is being launched in the UK, says the Sun. The £800 collagen jab takes less than half-an-hour and is given under local anaesthetic.

5. School desks and chairs are to be enlarged to meet the needs of the UK's ever-heavier schoolchildren, reports the Express. On average British children are a centimetre taller than they were 10 years ago, and there are more obese youngsters, so desks supplied to UK schools will reflect this.

6. Wind turbines or solar panels built by UK companies anywhere in the world could count towards Britain's renewable energy targets under controversial government proposals, according to the Financial Times.

7. You will soon be able to have a tattoo on your teeth, reports the Sun. Steve Heward, the dentist who started the craze in the US, plans to set up in Britain.

8. The traditional Chinese martial art T'ai Chi can help control diabetes, reports the Daily Mail. Apparently, researchers have found the flowing movements and deep breathing involved can result in a fall in blood sugar levels.

9. A thief walked out of a busy Norwegian aquarium with a crocodile that was over two feet long, says the Independent.

10. Drinkers have been banned from calling barmaids "love". An outraged Daily Star says new discrimination laws mean landlords that allow punters to chat up staff could be hauled before a tribunal and face unlimited fines.

And finally, a genuine spoof. Have you heard the one about the penguins that can fly? A BBC camera crew filming a colony of Adelie penguins were astonished when they did something "no other penguins can do" and took to the Antarctic skies. Go to BBC iPlayer to see the "footage".

source : news.bbc.co.uk


Microsoft wins document format standards battle

Microsoft Corp (MSFT.O) has won a battle to have a key document format adopted as a global standard, improving its chances of winning government contracts and dealing a blow to supporters of a rival format.

The OpenDoc Society, which had argued that Microsoft's Office Open XML (OOXML) format was not ready for ratification by the International Organization for Standardization (ISO), published the results showing Microsoft's win on its Web site.

Microsoft welcomed the decision, which was leaked on Tuesday ahead of an official ISO statement expected on Wednesday, saying it created a "level playing field" for OOXML to compete with other standards.

Supporters of rival Open Document Format (ODF), which is already an ISO standard and widely used, said multiple formats defeated the purpose of having standards and that the result would help Microsoft tighten its grip on computer users.

Tom Robertson, Microsoft's head of interoperability and standards, said: "Open XML joins the ranks of PDF, HTML and ODF among the ranks of document formats. I think it makes it easier for governments to offer users choice."

"The control over the specification now moves into the hands of the global community. This is going to be one of the most, if not the most important document format around the world for years to come," he added in a phone interview.

James Love, director of Knowledge Economy International, which campaigns for fairer access to knowledge, told Reuters: "We are disappointed."

"Microsoft's control over document formats has destroyed competition on the desktop, and the fight over OOXML is really a fight over the future of competition and innovation."

Microsoft, shepherded through a fast-track ISO approval process by European standards organization Ecma, lost a first ISO vote in September. Under the process, a second vote was allowed after a so-called ballot resolution meeting last month.

In the second voting period that closed on March 29, Microsoft won the approval of 86 percent of voting national bodies and 75 percent of those known as P-members. A two-thirds majority of the P-members was required.

Among those voting in favor of OOXML were the United States, Britain, Germany and Japan, according to the OpenDoc Society list. Opponents included China, India and Russia.

The process tested ISO to its limits as national bodies waded through the 6,000-page specification that defines OOXML, then dealt with more than a thousand points of order at the ballot resolution meeting, which was designed to help reach consensus.

The ODF specification runs to just 860 pages, one of the reasons many experts argue that translation between the two formats is too incomplete to allow true interoperability, a concept that Microsoft has recently publicly embraced.

Michiel Leenaars, who is on the OpenDoc Society board and chaired the Dutch committee in the first stage of the ISO process, said OOXML was not ready to be an international standard and that the 15-month ISO process had been too fast.

"It was mission impossible," he told Reuters by phone. "The process wasn't meant for this type of thing."

(Editing by David Holmes and Braden Reddall)

source : reuters.com


ASUS GeForce 9800 GTX review

Just last week we brought you a review of the GeForce 9800 GX2, and today we have a new ASUS board in our test labs. Built around the GeForce 9800 GTX, it gives us the chance to put the brand new chip to the test.

Today will see the official launch of the GeForce 9800 GTX as the third member of the GeForce 9 series. We have already seen the GeForce 9600 GT performing in the mainstream sector and the 9800 GX2 aimed at the high-end $500+ market; the 9800 GTX sits right in between. There is also a GeForce 9800 GTS on the way, but we will leave that for another occasion.

But first things first, and without getting fooled by naming conventions: Nvidia has not gone back to the drawing board with the new GeForce 9 series. Without delving too deep you will find that GeForce 9 is still based on the same GeForce 8 architecture. We believe Nvidia has done this for two reasons. First, they successfully proved with the GeForce 9800 GX2 that a lot of power was still left untapped; the very expensive dual-GPU card recently demonstrated that it has more than enough power to churn through the latest and most demanding gaming titles. Furthermore, with little pressure from ATI in the high-performance segment, Nvidia has been able to reclaim their title without having to develop an entirely new architecture.

The other reason has to do with money; like any successful business, Nvidia has its own selfish motivations. Given that the original GeForce 8 series cost something in the range of $400 million to develop, the graphics giant sure is trying to juice it for every last penny possible. Nvidia has effectively kept the G80 architecture alive by shrinking it from 90nm to 65nm to improve efficiency.

Based on the infamous G92 architecture that we have seen used time and time again, the GeForce 9800 GTX shares very similar specifications with those of the GeForce 8800 GTS 512 graphics card. By now, the G92 has been used on numerous Nvidia graphics cards, including the GeForce 8800 GS, 8800 GT, 8800 GTS 512, and the 9800 GX2. Keeping this in mind, it will be interesting to see what makes the new GeForce 9800 GTX special.

The GeForce 9800 GTX is meant to replace the old 8800 GTX, which is no longer in production as it was rendered impractical by the 8800 GTS 512, which offers similar performance at a fraction of the cost. The new 9800 GTX is said to begin retailing in the $300 to $349 price range, which would place it in a league of its own as there are currently no ATI or Nvidia graphics cards competing there.

The specifications of the GeForce 9800 GTX and the 8800 GTS 512 are indeed very similar, though one feature unique to the 9800 GTX is 3-way SLI. As you may have read in our previous reviews, we are not as excited by SLI (or Crossfire) as we are by truly competent single-card solutions. After all, you would likely end up spending well over $1000 on that kind of setup with the 9800 GTX, and the performance scaling is questionable.

ASUS provided us with their version of the GeForce 9800 GTX, which is the one going out at retail, although for the most part it is built to closely match Nvidia’s reference design. Let’s move on and check this new graphics card out in more detail...

source : techspot.com


IE8 Beta 1 and IE7 Tips and Tricks

With approximately 80% of the browser market focused on iterations of Internet Explorer, web content is tailored by default to Microsoft's proprietary browser, even to the detriment of rival products such as Firefox or Opera. The intimate connection between browsers and the operating systems they are associated with is the main reason why content developers have to adapt their materials to both Internet Explorer 6 and Internet Explorer 7 running on Windows XP and Windows Vista. And even as Microsoft moves forward with the building of Internet Explorer 8, adopting a new development strategy that places a strong emphasis on standards support, the slow migration from IE6 to IE7 means the company is still pushing a bag of tips and tricks for the latest version of IE.

Internet Explorer 7 'Tips and Tricks' is a document designed to highlight some of the "common deployment issues with Internet Explorer 7, along with solutions and workarounds," according to Microsoft. Ultimately, the resource is meant to increase compatibility of both web content and browser applications with Internet Explorer 7. Well over a year and a half after the delivery of IE7, IE6, although having given up the lion's share of its install base to its successor, is still a much stronger presence than Firefox. This reveals that the natural upgrade from IE6 to IE7, streamlined by the distribution of Internet Explorer 7 through Automatic Updates, is lagging behind because of the difficulty of tailoring Internet Explorer 6-centric content and applications to its successor. Internet Explorer 7 'Tips and Tricks' is available for download here.

At the same time, Microsoft is looking at the prospect of facing similar issues with Internet Explorer 8. This is why, at the beginning of March, Internet Explorer 8 Beta 1 was kicked into gear and aimed at web content developers and designers. The Redmond company is trying nothing more than to get an early start on the introduction of IE8 by getting developers to prepare their content for the next version of Internet Explorer. Microsoft is offering resources set up to enable the tweaking of both add-ons and websites for IE8, as well as developers' white papers, hands-on labs for new features and an Internet Explorer 8 Beta 1 technology overview. Just another bag of tips and tricks, available through the Internet Explorer 8 Beta 1 Readiness Toolkit.

Internet Explorer 7 is available for download here.
Internet Explorer 8 Beta 1 is here.

source : news.softpedia.com


Java IDE adds Application Factories for code reuse

With a release of the JBuilder 2008 IDE for Java development on Tuesday, CodeGear is adding a capability called Application Factories, which enables code reuse and makes it easier to update applications.

Application Factories features a methodology and tools to navigate framework choices, open source, internal code, and deregulated technology standards to determine use and reuse, CodeGear said. Developers can communicate intent, capture instructions and recommendations, and point to resources.

"The methodology [behind Application Factories] is really based on supporting application development from an applications perspective," said Michael Swindell, vice president of products and strategy at CodeGear.

"We create an IDE within JBuilder 2008, which is really intended for a continuous round-trip development cycle," he said.

Information is captured about how and why an application was created. Developers can attach scripts and tags.

CodeGear's Application Factories concept differs from Microsoft's similarly named Software Factories, said analyst Vishwanath Venugopalan of The 451 Group.

"Software Factories encapsulates patterns of usage and reuse at the source code level, whereas Application Factories capture patterns in how the development tool itself is used," Venugopalan said. "In other words, Application Factories encourages capturing and reuse of development best practices rather than architecture and design patterns."

With Application Factories, a generic IDE becomes an application-specific IDE, CodeGear said. Reusable code is delivered as modules that contain code and knowledge about purpose and design. Developers can choose from modules stored in a metadata repository. Or developers can create their own modules that can be stored in the repository for later reuse.

Based on the Eclipse 3.3 platform and Eclipse Web Tools Platform 2.0, JBuilder 2008 also features an Instantiations Swing Designer, providing a visual layout tool to help developers construct Swing-based GUIs.

With JBuilder 2008, CodeGear is offering a commercial IDE that adds value on top of the base, free Eclipse IDE. For some developers, the base Eclipse IDE is sufficient, but others want more, Swindell said.

"There's also a very large number of customers who want to know that they can get enterprise-level support, that they can get regular product updates, that they can get developer support, and those things all come in with a commercial product," he said. Capabilities such as Application Factories will not be found in the base Eclipse IDE, Swindell said.

"This release of JBuilder should be better able to hold its own against the free Eclipse IDE than the previous one because it contains Application Factories and the well-regarded Swing Designer under partnership with Instantiations," Venugopalan said.

Other features in JBuilder 2008 include:

* Struts 1, Struts 2, and JavaServer Faces support
* Updated support for commercial and open source Java application servers
* Collaboration and team development with TeamInsight and ProjectAssist
* Code coverage, memory, and CPU profiling
* Thread debugging and request analyzer
* UML (Unified Modeling Language)
* Code archaeology, featuring tools for mining application source code
* Code metrics and audits
* Developer support for Java Platform Enterprise Edition 5 (Java EE 5) and Java Development Kit 5

Supported on Windows XP (SP2), Windows Vista, Mac OS X (10.4/Tiger), and Red Hat Enterprise Linux 4, JBuilder is offered in Turbo, Professional, and Enterprise Editions.

The Enterprise product offers an enterprise-class Java IDE with Application Factories, collaboration and team development support, UML modeling, and code archaeology. It costs $1,499, or $750 to upgrade.

The Turbo edition is a free product with basic features to build Java applications. Professional, priced at $499, or $250 to upgrade, adds expanded support for Java EE 5 and Web services as well as code profiling, performance-tuning tools, Swing capabilities, and basic UML modeling.

Paul Krill is editor at large at InfoWorld.

source : infoworld.com

