Currently the starting plan for MariaDB is only 1 GB. I would rather expect plans along the lines of S - 5 GB, M - 35 GB, and L - 100 GB. Are there any plans to extend the MariaDB storage plans?
We are about to release an app hosted in the Swisscom Cloud that could generate around 500 MB of MariaDB data each month. With the current MariaDB plans we would rapidly run out of storage.
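A quick back-of-the-envelope calculation shows why the current plan is a problem. This sketch uses the ~500 MB/month growth figure from above and the S/M/L sizes proposed in the question (they are a suggestion, not actual Swisscom plans):

```python
# Months until each plan fills up, assuming a steady ~500 MB of new
# MariaDB data per month. The S/M/L sizes are the ones proposed in
# the question, not real Swisscom offerings.
GROWTH_MB_PER_MONTH = 500

plans_gb = {"current": 1, "S": 5, "M": 35, "L": 100}

for name, size_gb in plans_gb.items():
    months = size_gb * 1024 / GROWTH_MB_PER_MONTH
    print(f"{name:>7}: {size_gb:>3} GB lasts ~{months:.1f} months")
```

The 1 GB plan would be full in about two months, which is why a usage-based price model (as announced below) fits this workload better.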
Swisscom will change the price model for MariaDB as a Service and renew/optimise the Galera setup. This will happen soon; please subscribe to the newsletter for dates and announcements. The new MariaDB service will be billed by disk usage, with a price model similar to S3 (dynstrg).
Please contact Swisscom product management (via Swisscom Support) for capacity planning. I suspect the current setup won't be suitable for your growth needs.
I am planning to subscribe to the Aura cloud managed service on the plan with 4 GB memory, 0.8 CPU, and 8 GB storage.
But the storage is not enough. Is it possible to increase the storage in this plan?
How many CPU cores does this plan include if it is listed as 0.8 CPU?
The Aura pricing structure is very simple: you can increase storage (or memory or CPU) by paying for a higher-priced tier. Of course, you can contact Neo4j directly to ask whether they have any other options.
0.8 CPU means that you get the equivalent of 80% of a single core.
You can get more details from the Aura knowledge base and developer guide.
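To make the fractional-CPU idea concrete: container-style schedulers such as the Linux CFS bandwidth controller express a limit like "0.8 CPU" as a time-slice quota per period. How Aura enforces its limit internally is not public, so this is only a generic illustration of the arithmetic:

```python
# "0.8 CPU" is a time-slice share, not a physical core count.
# Container schedulers typically enforce it as quota/period: the
# workload may consume `quota` microseconds of CPU time per `period`.
# (This is generic container arithmetic, not Aura's documented
# internals.)
cpu_limit = 0.8          # fraction of one core, from the plan
period_us = 100_000      # a common enforcement period (100 ms)

quota_us = int(cpu_limit * period_us)
print(f"quota = {quota_us} us of CPU time per {period_us} us period")
# i.e. 80 ms of compute available in every 100 ms wall-clock window
```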
I started a VM instance for an ML task that needs to train a model on a 2 GB data set. I connected the VM to Google's Datalab and loaded the 2 GB dataset from a GCS bucket. The VM has the standard "n1-highmem-16" machine type.
Datalab automatically disconnects after 1-2 hours, but I was charged $10 for simply loading the 2 GB dataset into memory. I wondered whether it was because I did not shut down the VM soon enough, leaving an ongoing charge, so I reloaded the same dataset and monitored the charges. I found that I was charged $2 in 2 minutes for that task, so I expect the ongoing charges to accumulate fast.
These confusing charges basically make it impossible for me to finish a project entirely on GCP. Does anyone have suggestions on anything I may have done wrong in creating the VM or handling the task that led to such charges? If not, can anyone suggest a more reasonably priced cloud computing provider?
You can reach out to GCP Cloud Billing Support regarding your billing issue. In the meantime, you can look into the GCP pricing pages to get a better understanding of the specific pricing for different resources.
It is also worth opening an Issue Tracker case or contacting GCP's billing team for a better overview of the incurred charges.
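A rough sanity check can narrow down where the money went. The hourly rate below is an assumption (n1-highmem-16 was on the order of $0.95/hour on demand in a US region; verify against the current GCP price list before relying on it):

```python
# Back-of-the-envelope check on the charges, using an ASSUMED
# on-demand rate for n1-highmem-16 - verify against the current
# GCP pricing page, this is not an authoritative figure.
HOURLY_RATE_USD = 0.95

def vm_cost(hours: float, rate: float = HOURLY_RATE_USD) -> float:
    """On-demand compute cost for running the VM for `hours` hours."""
    return hours * rate

# 1-2 hours of runtime explains only ~$1-2 of pure VM compute, so a
# $10 bill likely also includes persistent disk, network egress, or
# the VM continuing to run after Datalab disconnected (disconnecting
# the notebook does not stop the instance or its billing).
print(f"1 h: ${vm_cost(1):.2f}   2 h: ${vm_cost(2):.2f}")
```

If the per-minute charges really were ~$1/minute, that is far above any n1 compute rate, which points at another billed resource rather than the VM itself.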
How much bandwidth and storage do I need to host a small online shop with, say, 200-250 products, and how can I find out how many visitors my site gets monthly? Can you give me a real example to help me get an idea, perhaps from an existing online store (Stradivarius, Zara, or something smaller)?
Thanks in advance:)
Simply put:
Domain name - paid yearly, ~$10-40 (reasonable).
Web hosting - paid monthly, ~$3-10.
I wouldn't worry about disk space and traffic. For example, some hosts offer 50 GB of space plus unlimited traffic for ~$6/mo.
Shop engine - mostly free, needing 50-250 MB of space. Your products mainly consume MySQL database space, and the hosting plans priced as in point 2 commonly include enough of it for a store of your size.
WooCommerce, PrestaShop, and OpenCart will require less space; monsters like Magento will require more.
Statistics - to monitor traffic and visitor numbers you can use Google Analytics.
SEO - while your web store is not yet popular, there is little point in investing in it.
If you are a startup, you can first leverage ready-made online marketplaces like eBay, and only later take on the trouble of a self-hosted web shop.
P.S. A rough traffic estimate is 1-2 MB per page. If we assume a 25% cache-hit rate for images (the user sees mostly new content with every other click) and about 7 product pages viewed per user, we can very roughly estimate that one user generates ~11 MB of traffic, i.e. ~1.1 GB per 100 users.
P.P.S. If you optimize your content, it will be significantly less.
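The per-user estimate above can be reproduced with the same assumed numbers (page weight, cache-hit rate, and pages per visitor are all rough guesses, as stated):

```python
# Reproducing the rough per-user traffic estimate from the P.S.
# All three inputs are the assumptions stated above, not measurements.
MB_PER_PAGE = 2.0        # upper end of the 1-2 MB page weight
PAGES_PER_USER = 7       # product pages a typical visitor opens
CACHE_HIT_RATE = 0.25    # fraction of image requests served from cache

per_user_mb = PAGES_PER_USER * MB_PER_PAGE * (1 - CACHE_HIT_RATE)
per_100_users_gb = per_user_mb * 100 / 1024

# per_user_mb comes out around 10-11 MB, matching the ~11 MB figure.
print(f"~{per_user_mb:.1f} MB per user, ~{per_100_users_gb:.2f} GB per 100 users")
```

Scaling this linearly gives a quick monthly-bandwidth budget: e.g. 1,000 visitors/month is on the order of 10 GB of traffic, comfortably inside any of the hosting plans mentioned above.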
Can anyone here help me compare the price per month of these two Elasticsearch hosting services?
Specifically, what is the equivalent of the Bonsai10 plan that costs $50/month in Amazon Elasticsearch Service pricing?
I just want to know which of the two services saves me money on a monthly basis for my rails app.
Thanks!
Bonsai10 is 8 cores, 1GB memory, 10GB disk, limited to 20 shards & 1 million documents.
Amazon's Elasticsearch Service doesn't have comparable sizing/pricing; all options will be more expensive.
If you want 10GB of storage, you could run a single m3.large.elasticsearch (2 core 7.5GB memory, 32GB disk) at US$140/month.
If you want 8 cores, a single m3.2xlarge.elasticsearch (8 cores, 30GB memory, 160GB disk) costs US$560/month.
Elastic's cloud is more comparable. 1GB memory 16GB disk will run US$45/month. They don't publish the CPU count.
Of the other better hosted Elasticsearch providers (better in that they list the actual resources you receive; full list below), Qbox offers the lowest-cost comparable plan at US$40/month for 1GB memory and 20GB disk. No CPU count is published. https://qbox.io/pricing
Objectrocket
Compose.io (an IBM company)
Qbox
Elastic
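One way to compare plans with such different shapes is to normalise them to cost per GB of disk and per GB of memory. The figures below are the ones quoted in this answer; prices change, so re-check each provider before deciding:

```python
# Normalising the quoted plans to $/GB-disk and $/GB-memory per month.
# Prices and sizes are the ones stated in this answer, current at the
# time of writing only.
plans = {
    # name: (usd_per_month, memory_gb, disk_gb)
    "Bonsai10":                 (50,  1.0,  10),
    "m3.large.elasticsearch":   (140, 7.5,  32),
    "m3.2xlarge.elasticsearch": (560, 30.0, 160),
    "Elastic Cloud":            (45,  1.0,  16),
    "Qbox":                     (40,  1.0,  20),
}

for name, (usd, mem, disk) in plans.items():
    print(f"{name:<26} ${usd/disk:>5.2f}/GB disk   ${usd/mem:>6.2f}/GB memory")
```

On this metric Qbox ($2.00/GB disk) undercuts Bonsai10 ($5.00/GB disk), while the AWS instances are cheaper per GB of disk but far more expensive in absolute monthly terms.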
I have a new instance in GCE; after a few days I migrated my websites, and they are now running in GCE.
I would like to know how much disk space I have available in GCE.
I used the monitoring tools, but I could only find the total volume of the disk, not how much of it is used or still available. Is it possible to see this?
The amount of storage available to your project is determined by the resource quota in place, which in turn is determined by the project's billing status. This could be:
Free trial - standard PD total: 10240 GB; this quota cannot be modified (more info here).
Billing enabled - standard PD total: 5120 GB; SSD PD total: 1024 GB; local SSD total: 1500 GB; these quotas can be increased upon request (more info here).
By default, Linux VMs deploy with a 10 GB boot disk unless an already-existing disk was used; the default for Windows machines is 100 GB.
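The quota above only caps how much disk you can provision; to see what a running VM is actually using, check the filesystem from inside the instance. A minimal stdlib sketch (equivalent to running `df -h /` on the VM):

```python
# Report total/used/free space for the root filesystem of the machine
# this script runs on - run it inside the GCE instance to see the
# numbers the Cloud Console monitoring does not show.
import shutil

total, used, free = shutil.disk_usage("/")
GB = 1024 ** 3
print(f"total {total / GB:.1f} GB  used {used / GB:.1f} GB  free {free / GB:.1f} GB")
```

Note that `used + free` can be slightly less than `total` because filesystems reserve some blocks for the superuser.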