I've been developing a web application for an amazing charity for the past two months. I have a Docker image, container, repository, etc. and want to publish my web app for the public to see and interact with. It is a UK/Ireland-based charity.
The problem is that hosting seems to be quite expensive. Are there any discounted or free hosting services available for non-profits to host their applications? I've seen some for US charities. Are there any services providing a discount that would be able to host my Docker container? Right now it is sitting on the free tier of Heroku, which is definitely not suitable for much traffic.
(Also, I'm new to hosting/Docker so any tips/first steps would be well appreciated!)
Azure has an excellent grant plan for verified NPOs you can use for anything (we are using it).
Azure isn't the most user-friendly, but it's probably worth the effort to get to know it.
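Once a nonprofit grant account is approved, getting a Docker container online on Azure can be done with a few az CLI commands. A rough sketch, in which every name (resource group, plan, app, region, image tag) is a placeholder you would replace with your own:

```shell
# Sketch: running a Docker image on Azure App Service via the az CLI.
# All names below are placeholders -- substitute your own.

# A resource group to hold everything
az group create --name my-charity-rg --location uksouth

# A Linux App Service plan; B1 is a small tier that grant credits can cover
az appservice plan create \
  --name my-charity-plan \
  --resource-group my-charity-rg \
  --is-linux --sku B1

# Create the web app pointed at a public Docker Hub image
# (or an Azure Container Registry image)
az webapp create \
  --name my-charity-app \
  --resource-group my-charity-rg \
  --plan my-charity-plan \
  --deployment-container-image-name mydockerhubuser/mycharityapp:latest
```

This is a sketch of the happy path only; a private registry or custom domain needs extra flags, and these commands require an authenticated Azure subscription to run.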
Amazon Web Services provides a grant of $1000 per year, but you have to pay a $95 admin fee.
AWS, through TechSoup, will make one grant of $1,000 in credits per fiscal year (July 1 to June 30) to eligible 501(c)(3) organizations. Organizations can apply these service credits toward usage fees for all AWS on-demand cloud services, as available by region. AWS credits are not valid for Amazon EC2 Reserved Instances, Amazon Mechanical Turk, AWS Marketplace, Amazon Route 53 domain name registration or transfer, or any upfront fee for any service. They can, however, be applied to monthly support fees.
A client asked me to deploy a Docker image for him on Cloud Run. I am able to do that in my own account, but once it is done, I am the one paying the running costs. Do you know the best practice to avoid this problem?
Should I bill my client every month? Should I deploy the Docker image in my client's GCP account? Or is it possible to charge my client directly from my GCP account?
Thanks in advance and sorry for this silly question.
Best regards.
You only have two choices:
deploy in your project and pay the fees. How you manage reimbursement is up to you.
deploy in your customer's project and they pay the fees.
Best practice: deploy in their project so that they own the service and are responsible for that service, e.g. following Google Terms of Service.
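For the second option, the deployment itself is one command once the client has granted your Google account the appropriate IAM roles (Cloud Run Admin, plus Service Account User) on their project. A sketch, where the service name, image, region, and project ID are placeholders:

```shell
# Sketch: deploying to Cloud Run in the *client's* GCP project, so that
# usage charges land on their billing account, not yours.
# Assumes your account has Cloud Run Admin + Service Account User
# on client-project-id. All names are placeholders.
gcloud run deploy client-service \
  --image=gcr.io/client-project-id/app:latest \
  --region=europe-west1 \
  --platform=managed \
  --project=client-project-id
```

The `--project` flag is what keeps the service, and its bill, inside the client's project; without it, gcloud deploys into whatever project your local config defaults to.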
My server is in Sweden (DB, Ruby on Rails web app, nginx). Sometimes people from South Africa and Southeast Asia visit my site.
For these users, the site is very slow; ping from Southeast Asia is about 200 ms. I want to solve this problem, but I have no experience with this kind of situation.
A CDN will not help because it only serves static content.
I thought I would need three servers, one per region (each with nginx, the web app, and a database), and then configure replication between the databases. But I was told that was a bad idea. What should I do instead? Maybe AWS with RDS?
If you're on AWS, you could put a Load Balancer in front of your service and run it in multiple availability zones; for users on other continents, though, it is running in multiple regions that actually reduces latency.
This is the same on AWS or Google Cloud
There are however a few things to take into consideration, for instance:
where is your db?
is it distributed?
is your service DB hungry?
Every scenario here is straightforward to set up in the AWS console: adding your service to multiple availability zones is about as easy as making your database available across multiple regions.
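If you do go multi-region, one common way to send each visitor to the nearest copy of the stack is Route 53 latency-based routing. A sketch, assuming a deployment already exists in each region; the hosted zone ID, domain, and IP addresses are placeholders:

```shell
# Sketch: latency-based DNS records so Swedish and Southeast Asian users
# each hit the region closest to them. Zone ID, domain and IPs are
# placeholders; one record per regional deployment.
aws route53 change-resource-record-sets \
  --hosted-zone-id Z1EXAMPLE \
  --change-batch '{
    "Changes": [
      {
        "Action": "CREATE",
        "ResourceRecordSet": {
          "Name": "app.example.com",
          "Type": "A",
          "SetIdentifier": "eu",
          "Region": "eu-north-1",
          "TTL": 60,
          "ResourceRecords": [{"Value": "203.0.113.10"}]
        }
      },
      {
        "Action": "CREATE",
        "ResourceRecordSet": {
          "Name": "app.example.com",
          "Type": "A",
          "SetIdentifier": "ap",
          "Region": "ap-southeast-1",
          "TTL": 60,
          "ResourceRecords": [{"Value": "203.0.113.20"}]
        }
      }
    ]
  }'
```

Route 53 answers each DNS query with the record whose region has the lowest measured latency to the client, which addresses the network round-trip part of the problem; the database replication question still has to be solved separately.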
If your server provides a RESTful interface and you want to use AWS for any reason, I would recommend API Gateway, just because it already has an edge-optimization feature: https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-basic-concept.html
"Edge-optimized API endpoint: The default host name of an API Gateway API that is deployed to the specified region while using a CloudFront distribution to facilitate client access typically from across AWS regions. API requests are routed to the nearest CloudFront Point of Presence (POP) which typically improves connection time for geographically diverse clients. An API is edge-optimized if you do not explicitly specify its endpoint type when creating the API."
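Concretely, creating such an edge-optimized API from the CLI might look like the following (EDGE is the default endpoint type, written out explicitly here for clarity; the API name is a placeholder):

```shell
# Sketch: create a REST API whose endpoint is edge-optimized, i.e. fronted
# by CloudFront POPs so geographically distant clients connect to a nearby
# edge location. The name is a placeholder.
aws apigateway create-rest-api \
  --name my-api \
  --endpoint-configuration types=EDGE
```

Note this only shortens the TLS/connection leg to the nearest POP; requests are still ultimately served by the region where the backend runs.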
I am trying to set up a data connection for Watson Analytics and IBM Data Refinery states I have maxed out my data limits. Is there a way to clear or reset?
The default IBM Cloud account type, known as a Lite account, provides access to all IBM Cloud services that offer a Lite plan option. Those plans provide free, but limited, access to services. For Data Refinery (part of the Watson Knowledge Catalog and Watson Studio services), the Lite plan has a limit of 50 Capacity Unit Hours per month. The message you're seeing appears when you have reached the limit of free usage the plan provides. The limit is reset each calendar month, and it can be removed by upgrading your IBM Cloud account and associated service plan(s).
You can check which plan your services are on via this link:
https://dataplatform.ibm.com/console/overview
For information on account types, see here:
https://console.bluemix.net/docs/account/index.html#accounts
For information on changing service plans, see here:
https://console.bluemix.net/docs/account/change-plan.html#changing
For information on Capacity Unit Hours in relation to Data Flows (which Data Refinery creates), see here:
https://www.ibm.com/cloud/watson-knowledge-catalog/faq
I'm new to Amazon's cloud, though I have used other cloud providers like Rackspace, Windows Azure and Heroku. I want to deploy my Ruby on Rails 4 application on Amazon, but I am overwhelmed by all of the services Amazon offers: AWS, EC2, EBS, S3, SimpleDB, Elastic Beanstalk... argh!!
My site is a relatively simple Rails app with a Postres database. There will not be much traffic at launch but we obviously hope it will grow and need to scale up.
What is a simple, no-frills plan that Amazon offers to get my app out there? I feel like I need to read 100 pages of documentation just to understand what it is that Amazon is offering.
First of all, there are no plans. You sign up for an AWS account, and you have access to whichever services you want to use.
Secondly, I can wholeheartedly recommend a single-instance Elastic Beanstalk environment to get started. It only uses 1 EC2 virtual server behind the scenes, but you get much better deployment options.
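Assuming you have the EB CLI installed, a minimal single-instance setup for a Rails app could look like this; the application and environment names, platform string, and region are placeholders:

```shell
# Sketch: single-instance Elastic Beanstalk environment for a Rails app.
# Names, platform and region below are placeholders.
pip install awsebcli

cd my-rails-app
# Register the app and pick a Ruby platform
eb init my-rails-app --platform ruby --region eu-west-1
# --single skips the load balancer, so you pay for just one EC2 instance
eb create my-rails-env --single
# Open the deployed environment in a browser
eb open
```

When traffic grows you can later rebuild the environment as load-balanced and let it auto-scale, without changing how you deploy.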
I can't speak to other services like Heroku.
I will develop and host an e-commerce website based on ASP.NET MVC4 (with several SQL Server jobs).
I am thinking of using Azure in order to stay in Microsoft's world and avoid dedicated server management.
The Shared Web Site package with 1 site / 5 GB SQL Server database / 200 GB bandwidth is very interesting, with the price based on 12 months.
But I don't know if this configuration is enough, especially the bandwidth.
What do you think? Have you used Azure for this type of application?
Regards,
Guillaume.
If you develop an e-commerce application, you will have to protect customers' sensitive data (credit cards, address details, etc.) via secure connections (HTTPS; in many countries this is a legal requirement). For that reason you will need SSL support.
Azure Web Sites do not support SSL for custom domains. However, they do support SSL for *.azurewebsites.net DNS names. So if your e-commerce application's DNS name will be, say, my-ecom-app.azurewebsites.net, then it's fine. Otherwise, I would not recommend the Azure Web Sites solution yet (I am sure SSL support for custom domains on Azure Web Sites will be implemented).
Azure Cloud Services, on the other hand, have full support for SSL on custom domains.
One of the really good websites to check Azure features and development roadmap is ScottGu's Blog
Azure Web Sites do not support SSL on custom domains, and I don't know of any successful e-commerce site that does not run SSL for at least part of the site. If you really want to host your e-commerce site on Azure today, your only real choice is to run Virtual Machines for your web front-end servers, and either use them for your DB as well or use SQL Azure.
We developed a platform called Virto Commerce that does just that: an MVC4 website hosted on Azure. There was also a need for SQL jobs (indexing, payment processing, cart cleanups and so on), for which we used a WorkerRole (instead of a WebRole). A WorkerRole and a WebRole can actually be combined in a single deployment; however, it is better to use a separate instance for worker roles. In our case the WorkerRole acted as a scheduler for multiple jobs defined in the database.
The challenge with WorkerRoles, however, is making sure they scale well when new instances are added, so the workload needs to be distributed between multiple instances. This is done through queues and blob locks, where each job is split into two parts: one that schedules and partitions the work, and a second that picks up the next partition and completes it.
Hope this helps!
PS: Virto Commerce is now available as an open source project on codeplex, go to http://virtocommerce.codeplex.com