We've got an Excel spreadsheet floating around right now (globally) at my company to capture various pieces of information about each country's technology usage. The problem is that it goes out, comes back with changes that are never obvious and often conflicting, and then we have to smash the copies together. To me, the workbook is no more than a garbage-in/garbage-out type of application waiting to be written.
In a company that has enough staff and knowledge to dedicate to enterprise projects, for some reason, agile and languages/frameworks such as Rails, Grails, etc. are frowned upon. That said, I can't help but think that this is almost a perfect fit for the need, given the scaffolding features for extremely simple implementations of capturing raw fields with only a couple of lookups (e.g. a pre-defined category). I'm thinking this would be considered a very appropriate use of these frameworks.
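For example (a minimal sketch with hypothetical model and field names, and the Rails 2-era generator shown in a comment), one scaffold call produces the model, migration, controller, and forms for the capture screens:

    # Hypothetical scaffold for the country/technology capture app; "Category"
    # is the pre-defined lookup mentioned above.
    #   script/generate scaffold CountryTechnology country:string notes:text category_id:integer
    #
    # app/models/country_technology.rb
    class CountryTechnology < ActiveRecord::Base
      belongs_to :category              # the pre-defined lookup
      validates_presence_of :country    # keep at least the key field required
    end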
Has anyone worked on these types of quick and dirty apps before in normally large-scale, heavy-handed enterprise environments with success? Any tips for communicating this need/appropriateness to non-technical management?
The only way to get this implemented in a rigid organization is to get this working and demo it -- without approval. It's very hard for management to say no to a finished project.
I work for a really big company and have written many utility apps based on Rails (as well as contributed to some larger Rails projects). That said, the biggest concern is not the quality of the app, but who's going to support/maintain it when you leave or get hit by a bus.
IMHO, the major fear an enterprise organization has, especially if the application becomes more critical to its core business, is how to support it. If it doesn't fit into its neat little box of supported technologies, it's less likely to happen.
Corporations have been bitten by this many times in the past and are cautious when bringing in new technology.
So, if you can drum up more folks to learn Ruby/Rails in your group (or elsewhere in your company), you may be able to make a good case for it. Otherwise, sad to say, you're probably better off implementing something on SharePoint :-(.
If you already have a Java infrastructure, then creating a Grails app will require little to no additional IT ramp-up to support and maintain. The support and maintenance cost and effort should be the same as for a Java application (i.e. Grails apps run on Tomcat, use the same JVM, use the same diagnostic/profiling tools, etc.).
In my experience, larger IT organizations have a harder time supporting Ruby when it's not already in the toolchain, because it's a new language and a new deployment environment, and it requires a considerable amount of support and maintenance ramp-up.
I would develop a minimum viable product, then make friends with someone in IT who can help you deploy it into a staging or production environment. Then get a few of the users to hop on board and test it like it's a beta product. After that, open it up to a larger audience.
So as others have said, forgiveness over permission, but be smart about the impact on the IT organization.
How can I write a cloud-aware application, i.e. an application that benefits from being deployed on the cloud? Is it the same as an application that runs on a VPS/dedicated server? If not, what are the differences? Are there any design changes? What steps do I need to take if I am to migrate an application to being cloud-aware?
Also, I am about to implement a web application idea which needs features like security, performance, and caching, and, more importantly, has to be free. I have been comparing some frameworks and found that Django has the lowest RAM/CPU usage and works great in prefork+threaded mode, but I have also read that Django-based sites stop responding under a huge load of connections. Other frameworks that I have seen/know are Zend, CakePHP, Lithium/Cake3, CodeIgniter, Symfony, Ruby on Rails....
So I would leave this to your opinion as well: suggest a good free framework based on my needs.
Finally, thanks for reading the essay ;)
I feel a Matrix moment coming on... "What is the cloud? The cloud is all around us, a prison for your program..." (What? The FAQ said bring your sense of humour...)
OK, so seriously, what is the cloud? It depends on the implementation, but the usual features include scalable computing resources and charges per CPU-hour, per storage area, etc. So yes, it is a bit like developing on your VPS/a normal server.
As I understand it, Google App Engine allows you to consume as much as you want. The back-end resource management is done by Google and billed to you and you pay for what you use. I believe there's even a free threshold.
Amazon EC2 exposes an API that actually allows you to add virtual machine instances (someone correct me please if I'm wrong), having pre-configured them, deploy another instance of your web app, and talk between private IP ranges if you wish (Slicehost definitely allows this). As such, EC2 can let you act like a giant load balancer on the front end, passing work off to a whole number of VMs on the back end, or expose all of that publicly; take your pick. I'm not sure on the exact detail because I didn't build the system, but that's how I understand it.
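As a rough illustration only (this uses the aws-sdk-ec2 Ruby gem, which is an assumption on my part and postdates this discussion; the AMI ID is a placeholder), launching one more pre-configured instance programmatically looks something like:

    require 'aws-sdk-ec2'   # assumes the gem is installed and AWS credentials are configured

    ec2 = Aws::EC2::Client.new(region: 'us-east-1')

    # Launch one extra instance from an existing, pre-configured image.
    resp = ec2.run_instances(
      image_id:      'ami-0123456789abcdef0',   # placeholder
      instance_type: 't2.micro',
      min_count:     1,
      max_count:     1
    )
    puts resp.instances.first.instance_id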
I have a feeling (but I know the least about Azure) that on Azure, resource management is done automatically for you, by Microsoft, based on what your app uses.
So, in summary, the cloud is different things depending on which particular cloud you choose. EC2 seems to expose an API for managing resources; GAE and Azure appear to be environments which grow and shrink in the background based on your use.
Note: I am aware there are certain constraints when developing on GAE, particularly with Java. In a minute, I'll edit in a link to another thread where someone made an excellent comment on one of my posts to this effect.
Edit as promised, see this thread: Cloud Agnostic Architecture?
As for a choice of framework, it really doesn't matter as far as I'm concerned. If you are planning on deploying to one of these platforms, you might want to check framework/language availability. I personally have just started Django and love it, having learnt Python a while ago, so, in my totally unbiased opinion, use Django. Other developers will probably recommend other things, based on their preferences. What do you know? What are you most comfortable with? What do you like the most? I'd go with that. I chose Django purely because I'm not such a big fan of PHP, I like Python, and I was comfortable with the framework when I initially played around with it.
Edit: So how do you write cloud-aware code? You design your software in such a way that it fits one of these architectures. Again, see the cloud-agnostic thread for some really good discussion of ways of doing this. For example, you might talk to some services on GAE which scale. The fact that they are on GAE (as an example) doesn't really matter; you use loose-coupling ideas. In essence, this is just a step up from the web service idea.
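As a small sketch of that loose coupling (the service URL, environment variable, and JSON shape are all made up), the caller only knows an endpoint, not where it runs:

    require 'net/http'
    require 'json'
    require 'uri'

    # The endpoint is configuration, not code: it could be a GAE service today
    # and something else tomorrow without the caller changing.
    PRICING_SERVICE_URL = ENV.fetch('PRICING_SERVICE_URL', 'https://example.com/prices')

    def fetch_price(sku)
      uri = URI("#{PRICING_SERVICE_URL}/#{sku}")
      response = Net::HTTP.get_response(uri)
      raise "pricing service error: #{response.code}" unless response.is_a?(Net::HTTPSuccess)
      JSON.parse(response.body)['price']
    end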
Also, another feature of the cloud I forgot to mention is the idea of CDNs being provided for you: some cloud implementations might move your data around the globe to make it more efficient to serve, or just because that's where they've got space. If that's an issue, don't use the cloud.
I cannot answer your question, as I'm not experienced in such projects, but I can tell you one thing... both CakePHP and CodeIgniter are designed for PHP 4, in other words for really old technology, and it seems nothing is going to change in their case. Symfony (especially the 2.0 version, which is still in heavy beta) is worth considering, but as I said at the very beginning, I cannot support this with my own experience.
For designing applications for deployment to the cloud, the main thing to consider is recoverability. If your server is terminated, you may lose all of your data. If you're deploying on Amazon, I'd recommend putting all data that you need persisted onto an Elastic Block Storage (EBS) device. This would be data like user-generated content/files, the database files, and logs. I also snapshot the EBS volume on a 5-day rotation so that it's backed up itself. That said, I've had a cloud server up on AWS for over a year without any issues.
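A rough sketch of that rotation (using the aws-sdk-ec2 Ruby gem, which is only an assumption here; the volume ID is a placeholder, and a cron job plus the EC2 command-line tools would do just as well):

    require 'aws-sdk-ec2'

    VOLUME_ID = 'vol-0123456789abcdef0'   # placeholder EBS volume holding uploads, DB files, logs
    ec2 = Aws::EC2::Client.new(region: 'us-east-1')

    # Take today's snapshot.
    ec2.create_snapshot(volume_id: VOLUME_ID, description: "nightly #{Time.now.strftime('%F')}")

    # Delete snapshots of this volume older than 5 days to keep the rotation.
    snapshots = ec2.describe_snapshots(
      filters: [{ name: 'volume-id', values: [VOLUME_ID] }]
    ).snapshots
    snapshots.select { |s| s.start_time < Time.now - 5 * 86_400 }.each do |s|
      ec2.delete_snapshot(snapshot_id: s.snapshot_id)
    end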
As for frameworks, I'm giving Grails a try at the moment and I'm quite enjoying it. It's built to be syntactically similar to Rails but runs on the JVM. That means you can take advantage of all the Java goodness, like threading, concurrency, and all the great libraries out there, to build your web application.
The nature of our business often has 2-3 remote developers working on a single project (mostly Rails), and each one currently has carte blanche access to the source so they can check out, run, and develop locally.
The problem is any one of them could ship the whole code base out the back door. Overseas legal action seems futile.
I'm guessing the best way would be a separation-of-duties type of strategy where a contractor only gets specific portions of the code, but how can they run and test the full project?
I'm looking for advice, strategies, or even software solutions to mitigate this risk.
Thanks a ton.
You should really allow only trusted people to handle your family jewels. I can't think of any stronger sign of trust a software company can show than to give someone complete access to their source.
That being said, a few ideas come to mind.
If they're consultants, you should see if you can get some kind of business agreement with an entity in the remote country that can take care of local legal hassles for you. US companies with offices in India do this all the time.
Perhaps you can give access to non-important pieces of the software to the untrusted parties and have them work only on those? The unit testing can be done by them on those pieces, but the integration tests that require the entire system have to be done by you.
It might also be possible for them to use the 'important' parts of the code as a service from a server you provide rather than as modules locally. Admittedly, this requires some reengineering but it might be worth it.
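A rough sketch of that idea (all names hypothetical; Sinatra used only for brevity): the sensitive algorithm stays on a server you control, and the contractors' local checkouts call it over HTTP instead of bundling it.

    require 'sinatra'    # runs only on your own server, never shipped to contractors
    require 'json'
    require_relative 'proprietary_pricing_engine'   # hypothetical in-house module

    # Contractors' copies of the app call this endpoint instead of bundling
    # the real implementation.
    post '/internal/price' do
      request_data = JSON.parse(request.body.read)
      { price: ProprietaryPricingEngine.quote(request_data) }.to_json
    end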
The bottom line is what Stephen said. Low priced off-shore contractors come with certain liabilities. If you're not willing to accept that, you'll have to change your mode of working.
I don't think there's much you can do about it. Either take the risk, or don't use off-shore contractors.
But I'd balance this with an honest assessment of how valuable your code is to you and to a supposedly dishonest contractor. If it is really valuable, then you should be able to afford to take legal action to protect it ... even in a difficult legal environment.
Well, if you don't trust your developers enough to let them work with your code, then don't hire them.
I don't see any way you can meaningfully limit their access to the code without seriously impacting their productivity. Even if you could give them some of the code in compiled form only, it's very useful to have access to the full source to understand problems and bugs.
At any rate, you may be overestimating the threat: Most kinds of piracy would be possible even with the binary, and without the infrastructure and customers your company provides, the stolen code is probably not worth much.
I'm a big fan of Ruby on Rails, and it seems to incorporate many of the 'greatest hits' of web application programming techniques. Convention over configuration in particular is a big win to my mind.
However, I also have the feeling that some of the convenience I am getting comes at the expense of technical debt that will need to be repaid down the road. It's not that I think RoR is quick and dirty; I think it incorporates a lot of best practices and good default options in many cases. However, it seems to me that it just doesn't cover some things yet (in particular, there is little direct support for security in the framework, and the plugins that I have seen are variable in quality).
I'm not looking for religious opinions or flamewars here, but I'd be interested to know the community's opinion on what areas Rails needs to improve on, and/or things that users of Rails need to watch out for on their own because the framework won't hold their hand and guide them to do the right thing.
Regardless of the framework, the programmer needs to know what she's doing. I'd say that it's much easier to build a secure web application using something as mature, well designed, and widely adopted as Ruby on Rails than going without the framework's support.
Take care with plugins and find out how they work (again, know what you're doing).
I love Rails too, but it's important for us to understand the shortcomings of the framework that we use. Though it might be a broad topic, addressing these issues won't hurt anyone.
Aside from security issues, one other big issue is DEPLOYMENT on Shared Hosts. PHP thrives in shared hosting environments but Rails is still lagging behind.
Of course most professional Rails developers know that their apps need fine-tuned servers for production and they will obviously deploy on Rails-Specific hosts.
In order for Rails to continue its success, the core team should address this issue, especially with Rails 3.0 (Merb + Rails) coming.
An example of this is simple: I have a Bluehost account, and I noticed the Rails icon in my cPanel. I talked to Bluehost support and they said it's more or less a dummy icon and that it doesn't function properly.
Having said that, any professional who wanted to deploy a Rails app would not use Bluehost, but it does hurt Rails when hosts say that they support it and then users run into problems that the hosts' support staff know nothing about.
The article you refer to defines technical debt as "[the] eventual consequences of slapdash software architecture and hasty software development".
With Rails, any development that is not test-driven incurs technical debt. But that is the case with any platform.
At an architectural level, Rails presents some deployment challenges. A busy site must scale with lots of hardware or use intelligent caching strategies.
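As one small sketch of such a caching strategy (Rails 2/3-era page caching; the controller is hypothetical and caching must be enabled in the environment config):

    # app/controllers/reports_controller.rb (hypothetical)
    class ReportsController < ApplicationController
      # Writes public/reports.html on the first hit so the web server can serve
      # later requests without touching Rails at all.
      caches_page :index

      def index
        @reports = Report.all
      end
    end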
My advice to anyone adopting Rails would be to:
- use TDD for all your development (a minimal test sketch follows this list)
- verify the quality of any plugin you use by reading its tests; if they are not clear and complete, avoid the plugin
- read "Rails Recipes" and "Advanced Rails Recipes" (Advanced Rails Recipes has a good recipe for adding authentication in a RESTful way)
- be prepared to pay for hardware to scale your site (hardware is cheaper than development time)
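As promised above, a minimal model test sketch (Rails 2-era Test::Unit layout; the Article model is hypothetical):

    # test/unit/article_test.rb (hypothetical model)
    require 'test_helper'

    class ArticleTest < ActiveSupport::TestCase
      test "is invalid without a title" do
        article = Article.new(:title => nil)
        assert !article.valid?, "expected a missing title to make the article invalid"
      end
    end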
From my experience, by far the biggest tolls you end up paying with RoR are:
Pretty big default stack (not counting plugins you might be using)
Updating models tends to be a pain in the ass, at least on production servers.
Updating Rails or Ruby themselves is a bit more complicated than it should be, but this differs depending on your server setup.
As ewalshe mentioned, deployment is sometimes a drag, and further down the road, should you require it, scaling gets a bit iffy, as it does with most development frameworks.
That being said, I'm an avid user of RoR for some projects, and with the current state of hardware, even though you do end up paying some technical debt for using it, it's almost negligible. And one can hope these issues will eventually be reviewed and solved.
With any level of abstraction there is a bit of a toll you pay: genericized methods aren't quite as fast as ones built just for your specific purpose. Fortunately, though, it's all right there for you to change. Don't like the query plans that come out of the dynamic find methods? Write your own and you're good to go.
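For instance (a sketch with a hypothetical Order model), ActiveRecord lets you drop down to hand-written SQL whenever the generated query isn't what you want:

    # The Rails 2/3-era dynamic finder builds the query for you...
    open_orders = Order.find_all_by_status('open')

    # ...but you can always write the SQL yourself when you need control over the plan:
    open_orders = Order.find_by_sql(
      ["SELECT * FROM orders WHERE status = ? ORDER BY created_at DESC LIMIT 100", 'open']
    )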
Someone above put it well: hardware is cheaper than developers. I'd add "at a sufficiently low amount of hardware".
I'm reading Deploying Rails Applications and recommend it highly to answer your concerns.
The book is full of suggestions to make life easier, taking a deployment-aware approach to your Rails development from scratch, rather than leaving it to later.
I don't think the choice of RoR implies a technical debt, but just reading the first few chapters alerted me to practices I should be following, particularly on shared hosts, such as freezing the core Rails gems so you can't be disrupted by upgrades on the host.
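For what it's worth, a sketch of what that pinning/freezing looked like in the Rails 2.x era (the version number is just an example):

    # config/environment.rb (Rails 2.x): pin the exact framework version the app was
    # tested with, so a host-wide gem upgrade can't silently change what you run against.
    RAILS_GEM_VERSION = '2.3.5' unless defined? RAILS_GEM_VERSION

    # Or copy the framework into vendor/rails so the host's gems are ignored entirely:
    #   rake rails:freeze:gems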
The 30-page chapter on shared hosts includes memory-quota tips such as using multiple accounts (if possible) with one Rails app per account. It also warns about popular libraries such as RMagick possibly pushing your memory size to the point where your processes are killed (such as a 100 MB limit, which it suggests some hosts periodically apply).
I have a memory of talking to people who had got so far using Ruby on Rails and then had to abandon it when they hit limits, or found it was ultimately too rigid. I forget the details, but it may have had to do with using more than one database.
So what I'd like to know is which features/requirements fall outside of Ruby on Rails, or at least require such contortions that it is better to use another, more flexible framework, even though you may have to lose some elegance or write extra boilerplate code.
Rails (not Ruby itself) is proud to be "opinionated software".
What this means in practice is that the authors of rails have a certain target audience in mind (themselves basically) and aim rails specifically at that. If X feature isn't needed for that target audience, it doesn't get added.
Off the top of my head, things that rails explicitly doesn't support that people may care about:
- Foreign keys in databases
- Connections to multiple databases at once
- SOAP web services (since Rails 2.0)
- Connections to multiple database servers at once
That said, it is very easy to extend Rails with plugins, and there are plugins which add all of the above functionality and a lot more, so I wouldn't really count these as limits.
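For example (Rails 3-era syntax, made-up names): plain ActiveRecord can already point one group of models at a second database via establish_connection, and the sharding/legacy plugins build on the same mechanism.

    # config/database.yml would define an extra entry such as "legacy_development".
    class LegacyBase < ActiveRecord::Base
      self.abstract_class = true
      establish_connection "legacy_#{Rails.env}"   # every subclass uses the second database
    end

    class LegacyCustomer < LegacyBase
      self.table_name = 'customers'
    end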
The only other caveat is that Rails is built around the idea of creating CRUD web applications using MVC. If you're trying to build something which is NOT a CRUD web app (like Twitter, which is actually a messaging system, or if you are insane and want to use a model like ASP.NET WebForms), then you will also encounter problems. In this case you're better off not using Rails, as you're essentially trying to build a boat out of bicycle parts.
In all likelihood, the problems you will run into that can't just be fixed with a quick plugin or a day or two of coding are all inherent problems with the underlying C Ruby runtime (memory leaks, green threads, crap performance, etc.).
Ruby on Rails does not support two-phase commits out of the box, which may be required if your database-backed application needs to guarantee immediate consistency AND you need to use two or more database schemas.
For many web applications, I would venture that this is not a common use-case. One can perfectly well support eventual consistency with two or more databases. Or one could support immediate consistency with one database schema. The former case is a great problem to have if your app has to support a mondo amount of transactions (note the technical term :). The latter case is more typical, and Rails does just fine.
Frankly, I wouldn't worry about limits to using Ruby on Rails (or any framework) until you hit real scalability problems. Build a killer app first, and then worry about scalability.
CLARIFICATION: I'm thinking of things that Rails would have a hard time supporting because they might require a fundamental shift in its architecture. I'll be generous and include some things that are part of the gem/plugin ecosystem, such as foreign key enforcement or SOAP services.
By two-phase commits, I mean attempting to make two commits to physically distinct servers within one transactional context.
Use case #1 for a two-phase commit: you've clustered your database, so that you have 2 or more database servers and your schema is spread across both servers. You may want to commit to both servers because you want to allow ActiveRecord to do something like a "foreign key map" that traverses across the different servers.
Use case #2 for a two-phase commit: you're attempting to implement a messaging solution (sorry, I'm a J2EE developer by day). The message producer commits to the messaging broker (one server) and to the database (a different server).
I also found some good discussion about the limits of ActiveRecord.
I think there is a greater “meta-question” here, that could be answered and that is “when is it OK to lean on external libraries to speed up development time?”
Third-party libraries are often great and can drastically reduce development time; however, there is a major problem, which Joel Spolsky calls "the law of leaky abstractions." If you look that up on Google, his post will come up first. Essentially this means that the trade-off for development time is that you have no idea what is going on under the covers. So when something breaks, you are completely stuck and have very limited means of debugging. It also means that if you hit one of the features that you really need but that is simply unsupported in Rails, you'll have no next step except to write the feature yourself, if you're lucky. Many libraries can make this difficult to do.
We’ve been burned badly in my dev shop by this issue. Our solutions worked fine under normal load, but we found that the third party subscription libraries that we were using simply could not stand up to the kind of load that we experienced once our site started to get a large number of concurrent users. This puts us in a very difficult place; essentially we have to rewrite the entire subscription service ourselves, with performance in mind. Doing this means that we’ve wasted all the time that we spent using the library.
Third-party libraries can be great for small to medium-sized applications; they can drastically reduce development time and hide complexities that aren't necessary to deal with in the early stages of development. However, eventually they will catch up with you and you'll likely have to rewrite or re-engineer your solution to get past the "law of leaky abstractions".
Ruby on Rails doesn't have functionality like IsPostBack in ASP.NET.
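The nearest equivalent is simply checking the HTTP verb on the request object, e.g. (a minimal sketch; idiomatic Rails would normally route GET and POST to separate actions instead):

    # In any Rails controller action: there is no page/postback lifecycle,
    # only the incoming request and its verb.
    def show
      if request.post?
        # handle the submitted form
      else
        # render the initial page
      end
    end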
Orion's answer is right on. There are few hard limits to AR/Rails (deploying to Windows, AR connectors that aren't frequently used, e.g. Firebird), but even for the things he mentioned, multiple databases and DB servers, there are gems and plugins that address those for legacy, sharding, and other reasons.
The real limitation is how time-consuming it is to keep on top of all the things that the Rails devs are working on, and to research specific issues, given how many blogs there are and how much mailing-list volume there is.