Can I cache mp3 files using service worker? [duplicate] - service-worker

This question already has answers here:
Cannot play cached audio from service worker
(2 answers)
Closed 1 year ago.
Is there any restriction on what kinds of files can be cached by a service worker?

Check out https://samdutton.github.io/samples/service-worker/prefetch-video/ which works around this issue by manually creating ranged responses.
Fixing this is gated on figuring out what browsers should be doing here, and updating the service worker spec if needed.
Original answer: https://stackoverflow.com/a/37614302/6773912

There are no restrictions on the kind of files you can cache.

Related

How to manage almost concurrent threads in RoR 6

Hoping someone can help me clarify the question itself. Not an expert at all in programming.
Until about 2 months ago, incoming requests to my RoR app would (I think) be run sequentially.
I have a lot of after_remove, before_remove, after_add actions for some of those requests.
Since the beginning of the year, errors started popping up about ids already being taken, and external resources would be created twice because requests for the same resource arrived while the first was still being processed.
Was there a recent change in the way requests are handled in RoR that I can turn off, other than simply reverting to a previous Rails version?
Thanks

Ruby Concurrency in cron job needed [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I am developing a system in Rails 4.0 whose API must handle simultaneous, continuous requests.
In the system, each user has 3 scripts to be run in the background. The scripts grab the user's information from the DB to call the API repeatedly and process transactions. Currently I am using cron jobs (the whenever gem) to run the scripts in the background for each individual user.
So my problem is that when the system has 1,000 users, I need to run 3,000 cron jobs.
I think this system will have problems. Can anyone help me solve this?
At this point you have a system that performs some tasks periodically, and the amount of work your system has to handle (let's say, per hour) is less than the amount of work it could handle.
However, the amount of work increases with the number of users in your system so, as you have already guessed, there will be a moment when the situation will be critical. Your system will not be able to handle all the tasks it has to do.
One way to solve this problem is adding more machines to your system. That is, if you are currently using a single machine to run all your tasks, consider adding another one and splitting the job. You can split the job between the machines in a number of ways, but I would use a producer-consumer approach.
You will need a queue manager: your producer periodically sends a batch of tasks to be done (you can still use the whenever gem for that), and a number of consumers (1 makes no sense; 2 would be OK for now, but you can increase this number) work through the tasks one by one until none are left.
The manager I like the most is Sidekiq but you can find some others that might match your needs better.
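A minimal in-process sketch of that producer-consumer split, using Ruby's stdlib `Queue` purely for illustration — in production the queue would be a real backend such as Sidekiq (backed by Redis), but the shape is the same: one producer enqueues a batch of per-user tasks, and a small pool of consumers drains it.

```ruby
tasks   = Queue.new
results = Queue.new

# Producer: one task per user (the whenever gem would trigger this batch).
user_ids = (1..10).to_a
user_ids.each { |id| tasks << id }

# Consumers: each worker pops tasks until the queue is empty.
workers = Array.new(3) do
  Thread.new do
    loop do
      id = begin
        tasks.pop(true) # non-blocking pop raises ThreadError when empty
      rescue ThreadError
        break
      end
      results << "processed user #{id}" # stand-in for the real API calls
    end
  end
end
workers.each(&:join)
```

Adding a machine then just means starting more consumer processes pointed at the same queue; the producer does not change.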

Implementing an online compiler [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I am doing a small project: hosting a site similar to ideone.com, i.e. an online compiler that compiles and runs code given as input. I am using RoR for the backend.
What I have done is: the code entered in the textbox is stored in a string, and I use system calls in Ruby to create a file and store the string in it. Similarly, I store the input for the code in another file. Again using system calls, I compile and run the file, store the output in a string, and send it to the front end.
I have two problems with the method above:
1) It only works for a single user at a time. Any idea how to implement it for multiple users, and if so, what would the limit on the number of users be?
2) Anyone can submit malicious code and harm the system. I need to sandbox the environment so the code runs in isolation. How can I do that?
A program running an infinite loop is not a problem, as I have put a limit on the execution time. I am using backticks to execute the shell script. I am implementing it for C; if I manage to solve all the problems I will extend it to other languages.
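The execution-time limit mentioned above is easier to enforce with `Open3` than with backticks, since backticks give no handle on the child process and a runaway program cannot be killed. A sketch (the 2-second default limit is an arbitrary assumption, and a time limit alone is NOT a sandbox — untrusted code still needs a container or VM around it):

```ruby
require "open3"
require "timeout"

# Run a command with an execution-time limit, killing it if it runs over.
def run_with_limit(cmd, stdin_data: "", limit: 2)
  Open3.popen3(*cmd) do |child_in, child_out, child_err, wait_thr|
    child_in.write(stdin_data)
    child_in.close
    begin
      Timeout.timeout(limit) do
        # [stdout, stderr, exit status] of the finished child
        [child_out.read, child_err.read, wait_thr.value.exitstatus]
      end
    rescue Timeout::Error
      Process.kill("KILL", wait_thr.pid) # stop infinite loops
      [nil, "time limit exceeded", nil]
    end
  end
end
```

The compile step (`gcc` on the saved file) and the run step can both go through the same helper.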
For the sake of not letting people wipe out your hard drive, install spambots, etc., you will need to run all code inside a virtual machine to protect the host. This also solves the multi-user problem, since you can spin up a virtual machine for each user and spin it down after running the code. However, this might use a lot of resources on your server.
I'd be interested to find out what ideone.com does. I suspect that everything runs in the client's browser, which is obviously much safer since you can just use your server to save their code, but not actually run it. If it runs in their browser it is sandboxed anyway. Are you sure that you don't want to do this instead? I've never heard of anyone letting people upload code and then run it on the system server. Seems kind of insanely risky.

Importing data that may take 10-15 minutes to process, what are my options in Rails? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
I have a Rails application that displays thousands of products.
The products are loaded from product feeds, so the source may be a large XML file or web service API calls.
I want to be able to re-use my models in my existing rails application in my import process.
What are my options in importing data into my Rails application?
I could use Sidekiq to fire off rake tasks, but I am not sure whether Sidekiq is suitable for tasks that take 10+ minutes to run. Most use cases I have seen are for sending emails and other similarly light tasks.
I could create a stand-alone Ruby script, but I am not sure how I could re-use my Rails models if I go this route.
Update
My total product count is around 30-50K items.
Sidekiq would be a great option for this, as others have mentioned. 10+ minutes isn't unreasonable, as long as you understand that if you restart your Sidekiq process mid-run, that job will be stopped as well.
The concern I have is if you are importing 50K items and you have a failure near the beginning you'll never get to the last ones. I would suggest looking at your import routine and seeing if you can break it up into smaller components. Something like this:
Start sidekiq import job.
First thing job does is reschedule itself N hours later.
Fetch data from API/XML.
For each record in that result, schedule an "import this specific data" job with the data as an argument.
Done.
The key is the second to last step. By doing it this way your primary job has a much better chance of succeeding as all it is doing is reading API/XML and scheduling 50K more jobs. Each of those can run individually and if a single one fails it won't affect the others.
The other thing to remember is that, unless you configure it not to, Sidekiq will rerun failed jobs. So make sure that the "import specific data" job can be run multiple times and still do the right thing.
I have a very similar setup that has worked well for me for two years.
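The fan-out shape in the steps above can be sketched as follows. The class names and the in-memory `PRODUCTS` hash are illustrative stand-ins; with the real gem you would `include Sidekiq::Job` and enqueue via `perform_async` instead of calling `perform` directly.

```ruby
class ImportRecordJob
  PRODUCTS = {} # stand-in for the products table

  # Idempotent on purpose: Sidekiq retries failed jobs, so importing the
  # same record twice must leave the same end state (an upsert, not a
  # blind create).
  def perform(record)
    PRODUCTS[record[:sku]] = record
  end
end

class ImportFeedJob
  def perform(feed_records)
    # Only read the feed and fan out one tiny job per record; if a
    # single record's job fails, the other ~50K are unaffected.
    feed_records.each { |record| ImportRecordJob.new.perform(record) }
  end
end

feed = [{ sku: "A1", price: 10 }, { sku: "B2", price: 20 }]
ImportFeedJob.new.perform(feed)
ImportFeedJob.new.perform(feed) # rerun: still two products (idempotent)
```

Because each per-record job carries its own data as an argument, a retry touches only that one record.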

Task engine for Ruby [duplicate]

This question already has answers here:
Scheduling tasks with rails
(2 answers)
Closed 9 years ago.
We have an application that needs to perform lots of short background tasks (each might schedule more). In the future we might need to run the tasks on multiple servers.
We also need the tasks (and their parameters) to be persistent (stored in a DB), and to be able to monitor tasks (status / logs / etc.).
Is there a ready-made solution that works with Ruby?
You can try delayed_job or resque.
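A toy illustration of what DB-backed libraries like delayed_job give you: the task and its parameters are serialized into a table row, so a worker on any server can pick it up later and its status can be monitored. The in-memory `TABLE` and `Marshal` serialization here are stand-ins for the real `delayed_jobs` database table.

```ruby
class JobStore
  TABLE = [] # stand-in for a DB table; each row holds the serialized job

  def self.enqueue(job)
    TABLE << { handler: Marshal.dump(job), status: "pending" }
  end

  # A worker (possibly on another server) deserializes and runs each
  # pending job, recording its status for monitoring.
  def self.work_off
    TABLE.each do |row|
      next unless row[:status] == "pending"
      Marshal.load(row[:handler]).perform
      row[:status] = "done"
    end
  end
end

SENT = []
EmailJob = Struct.new(:address) do
  def perform
    SENT << address # stand-in for actually sending the email
  end
end

JobStore.enqueue(EmailJob.new("a@example.com"))
JobStore.enqueue(EmailJob.new("b@example.com"))
JobStore.work_off
```

A job that schedules more jobs (as the question requires) just calls `enqueue` from inside its own `perform`.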
