- Packages and namespaces
- Optional static typing
- Generators and iterators
- Destructuring assignment (likely)
- JSON encoding/decoding
Not to mention performance improvements as a consequence of the optional static typing.
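To make the list above concrete, here is a small sketch in TypeScript, whose features closely mirror the ECMAScript 4 proposals (optional type annotations, destructuring, generators, built-in JSON). The `Account` type and all names are invented for illustration; ES4's actual syntax differed in details.

```typescript
// Optional static typing: annotations are allowed but not required,
// and give the compiler room to optimize and catch errors early.
interface Account {
  id: number;
  owner: string;
}

const account: Account = { id: 42, owner: "Ada" };

// Destructuring assignment: pull fields out of an object in one step.
const { id, owner } = account;

// Generators and iterators: lazily produce values on demand.
function* accountIds(accounts: Account[]) {
  for (const a of accounts) {
    yield a.id;
  }
}

const iter = accountIds([account, { id: 7, owner: "Bob" }]);
const firstId = iter.next().value;  // 42
const secondId = iter.next().value; // 7

// JSON encoding/decoding as part of the language runtime.
const wire: string = JSON.stringify(account);
const decoded: Account = JSON.parse(wire);
```

None of this required a bridge to the server: the typed model, the iteration, and the serialization all live in the client runtime, which is the point of the argument that follows.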
This just makes the case stronger for bringing business logic to the browser and getting rid of all those annoying GET- or POST-parameter-based web applications. I mean seriously, if an architect suggested building a desktop thick client where the controllers and models lived only on the server and the UI communicated with the controller by passing strings to it to trigger state changes in the model, he'd be considered officially insane. Yet the vast majority of state-interaction web applications (those with complex domain models) use exactly such an architecture, and nobody considers it odd.
Bottom line - once ECMAScript 4 is out and browsers start supporting it, all the 'thick clients are dead, long live the browser' weenies finally have a case. But only because the browser would've stopped being thin.
Isn't the browser made into a 'thin' client by leaving all the computation on the server?
And I believe the reason nobody has questioned this model is the relative security of the server compared to the browser. For example, if you implemented security checks (for online banking, for instance) in the browser, someone with a hacked version of a browser, or even Firefox with a clever extension, could immediately get access to privileged information.
Yes, the browser was made 'thin' by leaving the computation on the server. But for a certain class of applications, where the sole advantage of using a browser was easing deployment pains, this was no advantage at all. A lot of enterprise applications fall into this category.
Gmail, Google Calendar and Google Docs currently use a model similar to what I've described. So it's already out there; it's just not mainstream yet, is all.
I never meant to imply that there will be no adjustments in design or compromises when bringing stuff to the client. Of course, online banking would never have security checks on the client. There would always be a secure third party to do the verification, so that does not change. That's how you'd do it even when developing for the desktop, so the same design principles apply. If someone can still hack this, then they could just as easily hack the HTTPS form submission used today, since in either case that information has to be sent to the server.
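The division of labor being described might look like the following sketch (all names here, `Transfer`, `clientSideCheck`, `serverSideVerify`, are hypothetical): the client keeps a rich local model and validates eagerly for responsiveness, but anything security-sensitive is re-verified authoritatively on the server.

```typescript
interface Transfer {
  from: string;
  to: string;
  amount: number;
}

// Client side: validate eagerly so the user gets instant feedback,
// but treat the result as purely advisory.
function clientSideCheck(t: Transfer): boolean {
  return t.amount > 0 && t.from !== t.to;
}

// Server side: the authoritative check, run after the transfer arrives
// over HTTPS. A hacked browser can skip clientSideCheck entirely,
// but it cannot skip this.
function serverSideVerify(t: Transfer, availableBalance: number): boolean {
  return clientSideCheck(t) && t.amount <= availableBalance;
}
```

The client-side model buys richness and responsiveness; the server-side re-check is what keeps a compromised browser from mattering, which is exactly the same split a desktop thick client would use.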
Oh, I see what you're getting at now. I think this is probably why the Flash clones (AIR, Silverlight and JavaFX) are coming forward now to steal attention before ECMAScript 4 can take off, which could end up being a big problem for its adoption.
I hadn't thought of that! But you're right and I couldn't agree more.
Groovy has all the features you mention for ECMAScript 4.
Grails doesn't run in browsers. I'm trying to say that we now have a viable browser-based programming language (at least once Gecko and IE become ECMAScript 4 compliant).
And Blogger needs to change their 'This post has been removed by the author.' message to 'This post has been removed by its author.'
Right now it looks like I'm censoring comments on my posts. Which I'm not :-).
This is not client side! Yegge is using this on the server. It will not even run in a browser, as it requires Rhino-specific features. It has nothing to do with client-side scripting AT ALL.
I wouldn't consider an architect insane for creating an architecture that locks the domain model into the server. I would call it a good architecture. Passing domain objects between client and server is bad design, and I'm talking from my own experience. I just finished a large app with a smart client, and we chose to expose the domain model to the client. Managing state on the client AND the server gives you a lot of unwanted problems and gains you nothing.
@gernot - If you locked your models completely on the server, then it doesn't meet even the marketing definition of a smart client. (I'm assuming this was a .NET project? 'Smart Client' is MS marketing jargon for MS SOAP RPC web services combined with DataSets, so that the client can still be used even when it is no longer connected to the server... more a marketing pitch and less a technical design.)
I've been on, or seen my colleagues successfully deliver, multiple projects using both thick clients and browsers where the models were on the client. We've used .NET WinForms, Java Swing, GWT... never had a problem.
Google has gmail, docs and calendar which do the same. They don't seem to have a problem either. From my experience, I found these applications to have a richer UI and be faster, more responsive, and easier to modify and refactor.
Bluntly, if you can't manage models on both the client and the server (remember the client-server design that was all the rage before web apps became big?), something that was being done successfully in Smalltalk almost two decades ago, and in Java Swing and VC++ MFC a decade ago, then that's when you have a bad design.
What an awesome 20% project. Steve Yegge is the man 8)!
Turns out it wasn't a 20% project and John Lam got it wrong when he assumed it was all done by Steve. See Steve's post on the topic.
This is just one part of a mainstream Google project.
Looks like it isn't going open source anytime soon either.
Sounds interesting... looking forward to seeing how this competes with Flex et al.