I'm looking into creating a Bot for a messaging app, but need some help choosing a database for storing some basic user data.
I would use JSON, but I saw this answer and now I'm wondering what the "best" option is for server-side storage in general, in terms of security and efficiency.
In general: what should I consider when picking a storage option? Is there a "right" choice in this case?
I have a model: an LGBMClassifier built with scikit-learn, which I saved with pickle in .sav format.
This is a trading model that will be used to trade US stocks.
Now the task is to use the model for real trading. We want to implement the code in .NET; that is, the model was created in Python, but all of the code except the model itself will be implemented in another language.
I want to learn how to use a ready-made Python model in production when all of the logic except the model itself is implemented in another language. Can this be done?
I have never dealt with deployment before, so I would be grateful for any information on this subject. I want to understand which direction I should move in to deploy the model according to the requirements above.
If there are any other ways to resolve this issue, or if the described approach does not make sense, I would also be grateful for comments.
Thanks,
Yes, this can be done. One way:
1. Put the model dump on AWS S3 (a storage service for files that can easily be accessed online).
2. Write the logic for loading the model and predicting in an AWS Lambda (a serverless service for small utilities that can be called from anywhere through an API gateway), and put the Lambda behind AWS API Gateway so it can be accessed via an API.
3. Call that API from your .NET code and get the results.
The answer above uses the AWS ecosystem; you can use other providers, but the process will be the same.
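A minimal sketch of the Lambda side, assuming an API Gateway proxy-style event with a JSON body. A stub class stands in for the real pickled LGBMClassifier so the example is self-contained; in practice the bytes would come from S3 (e.g. via boto3), and the handler name is hypothetical:

```python
import json
import pickle

# Stub standing in for the pickled LGBMClassifier. The real model bytes
# would be fetched once from S3, e.g.:
#   s3.get_object(Bucket=..., Key=...)["Body"].read()
class StubModel:
    def predict(self, rows):
        # Toy rule: predict 1 ("buy") when the first feature is positive.
        return [1 if row[0] > 0 else 0 for row in rows]

MODEL_BYTES = pickle.dumps(StubModel())

def predict_handler(event, context=None):
    """API-Gateway-style handler: JSON feature rows in, predictions out."""
    # In a real Lambda, load once at module import (cold start), not per call.
    model = pickle.loads(MODEL_BYTES)
    features = json.loads(event["body"])["features"]
    preds = model.predict(features)
    return {"statusCode": 200, "body": json.dumps({"predictions": preds})}
```

Your .NET code would then POST feature rows to the API Gateway URL and read the predictions back from the JSON response.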
Good afternoon all!
Here's the background. I'm building a backend for a payment system. I want the canonical information for our Products and Plans to live on Stripe. As such, I do not want a duplicated copy on our backend that we pull from our RDB with ActiveRecord.
When someone calls for a list of Products or Plans I pull them down from Stripe, and then cache them. When we need to create/update/destroy them we make the API calls to Stripe and the webhooks trigger the updates.
I've not done something like this before; normally I just have an ActiveRecord model to CRUD.
Are there any best practices / methodologies to follow when doing something like this?
Thanks!
While it would be technically possible, I recommend against doing this in almost all cases. Your products are likely an integral part of your app. You will lose out on a lot of idiomatic Rails behavior and abilities by trying to roll your own logic this way. You'll also be coupling your app very tightly to a third-party service that you in no way control.
I would seriously consider using standard AR-backed models for this data in your app and taking the time to abstract out the code that manages the YOUR APP <--> 3RD PARTY APP communication. That way your app is the canonical source of the information it manages, you get all of Rails' sugar, and you're in control in the event that you need to change backend providers.
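The abstraction described above can be sketched like this (in Python for brevity; the Rails version would be an AR model plus a service object, and all names here are hypothetical):

```python
class Product:
    """Local, canonical product record (stand-in for an ActiveRecord model)."""
    def __init__(self, name, price_cents):
        self.name = name
        self.price_cents = price_cents
        self.external_id = None  # ID assigned by the payment provider

class PaymentGateway:
    """Thin adapter owning all YOUR APP <--> 3RD PARTY communication.
    Swapping providers means reimplementing only this class."""
    def __init__(self):
        self._remote = {}   # stand-in for the provider's records
        self._next_id = 1

    def push_product(self, product):
        # In a real app this would be a Stripe API call.
        if product.external_id is None:
            product.external_id = f"prod_{self._next_id}"
            self._next_id += 1
        self._remote[product.external_id] = {
            "name": product.name,
            "price_cents": product.price_cents,
        }
        return product.external_id
```

The app stays the source of truth; the gateway only mirrors records outward, so a provider change touches one class.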
Background
I have a fairly typical Rails application, which uses Devise for authentication management. While building this app, I realized that realtime chat would be a great feature to have. Ideally, of course, this would make use of Websockets, in order to reduce the polling load on the server (as well as making it marginally easier to implement, as you don't have to manage polling).
I quickly realized that Ruby isn't really a great fit for holding a large number of concurrent connections open at once. Phoenix, however, is written in Elixir, so I can make use of the Erlang VM, which is quite good at long-lived connections. It also seems like it could be greatly beneficial to store all the chat data separately from the main application database, which should also reduce load in the future.
The Problem
I want to be able to make this separation completely invisible to the user. They visit www.example.com/chat, and it loads all the relevant data in from chat.example.com and starts up the websockets, without requiring them to login to a separate service. I think using an <iframe> is probably the way to go about doing this.
My problem is sharing authentication and data between the two applications. The Rails app needs to be able to create conversations on the Phoenix app in response to certain events. The Phoenix app needs to know what user is currently authenticated into Rails, as well as general data about the user.
An OAuth flow with the Rails app as the ID provider seemed like a good fit at first, but I can't figure out a way for the Phoenix app to automatically be granted access. I also have some concerns about user records existing inside the Phoenix app—it should be aware of all users on the main application, so you can start a chat with a user even if they haven't ever opened chat.
What would be the best way to go about doing this? My intuition says that this is going to involve window.postMessage and some kind of token system, but I wanted to ask what the generally accepted way of doing this was before I accidentally created an insecure mess.
Sharing the session isn't too hard, assuming you are running at least Rails 4.1 and using JSON serialization (the default for apps created with >= 4.1). A quick Google search turns up PlugRailsCookieSessionStore, which accomplishes this.
For more information on what it takes to share a session between Rails and another language, Matt Aimonetti has an excellent blog post with detailed information.
Lastly, if you would prefer to stay entirely in Ruby, it's definitely doable. Ryan Stout discusses scalability around persistent connections in the FAQ for Volt, which uses a persistent connection for every user. The article he links to is also a great read. I only mention it to help you weigh the trade-off of building a separate app in another language.
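If you do go the token route the question mentions, the core idea is an HMAC-signed payload that the chat app can verify without a round trip, using a secret shared by both apps. A sketch in Python (function names and the token format are hypothetical; both sides would implement the same scheme in their own language):

```python
import base64
import hashlib
import hmac
import json

# Assumption: both apps are configured with the same secret.
SHARED_SECRET = b"replace-with-a-real-shared-secret"

def sign_user_token(user_id):
    """Main-app side: issue a token identifying the logged-in user."""
    payload = base64.urlsafe_b64encode(json.dumps({"user_id": user_id}).encode())
    sig = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def verify_user_token(token):
    """Chat-app side: check the signature and recover the user id, or None."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SHARED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or forged token
    return json.loads(base64.urlsafe_b64decode(payload))["user_id"]
```

A real implementation would also embed an expiry timestamp in the payload and reject stale tokens.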
I'm going to be collaborating with a Python developer on a web application. I'm going to be building a part of it in Ruby and he is going to build another part of it using Django. I don't know much about Django.
My plan for integrating the two parts is to simply map a certain URL path prefix (say, any request that begins with /services) to the Python code, while leaving Rails to process other requests.
The Python and Ruby parts of the app will share and make updates to the same MySQL datastore.
My questions:
1. What do people think generally of this sort of integration strategy?
2. Is there a better alternative (short of writing it all in one language)?
3. What's the best way to share sensitive session data (i.e. a logged-in user's id) across the two parts of the app?
As I see it, you can't use Django's auth, ORM, admin, or sessions; all you are left with is URL mapping to views and the template system. I'd not use Django but a simpler Python framework. It's time your Python programmer expanded his world...
One possible way that should be pretty clean is to decide which one of the apps is the "main" one and have the other one communicate with it over a well-defined API, rather than directly interacting with the underlying database.
If you're doing it right, you're already building your Rails application with a RESTful API. The Django app could act as a REST client to it.
I'm sure it could work the other way around too (with the rest-client gem, for instance).
That way, things like validations and other core business logic are enforced in one place, rather than two.
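On the Django side, that could be a thin client over the Rails API. A sketch with only the standard library (the endpoint path and payload shape are hypothetical; the `fetch` parameter is injectable so the class can be exercised without a live server):

```python
import json
from urllib import request

class RailsClient:
    """Minimal REST client the Django side could use instead of
    touching the shared database directly."""
    def __init__(self, base_url, fetch=None):
        self.base_url = base_url.rstrip("/")
        # `fetch` maps a URL to a response body; defaults to a real HTTP GET.
        self._fetch = fetch or self._http_get

    def _http_get(self, url):
        with request.urlopen(url) as resp:
            return resp.read().decode()

    def get_user(self, user_id):
        # Rails conventionally exposes JSON at resource paths like this.
        body = self._fetch(f"{self.base_url}/users/{user_id}.json")
        return json.loads(body)
```

This keeps validations and business logic in the Rails app, as the answer suggests, while the Django side consumes them over HTTP.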
A project, product, whatever you call it, needs a leader.
This is the first sign that you don't have one. Someone should decide whether you're doing Ruby or Python. I prefer Ruby myself, but I understand those who prefer Python.
I think starting a product by asking yourself these kinds of questions is a bad start.
If your colleague only knows Prototype, and you only know jQuery, are you going to mix those technologies too? The same for the database? And for testing frameworks?
This is a never-ending argument. One person should decide, IMHO, if you want something good to happen. I work with a lot of teams as a consultant, including Agile teams, some of them very mature, and this is the kind of thing they avoid at all costs.
The exception is when one of you is going to work on some specific part of the project which REALLY needs one or the other technology, while the other technology is still best for the rest of the application.
Think, for example, of batch computing: you have ALL of your web app in Rails or Django, and you have a script, called by cron or whatever, crunching huge amounts of data outside the web app and filling a DB.
My 2 cts.
Using Symfony 1.4.x (with Propel), I've been given a task that requires me to share specific user info with multiple external systems. This is currently stored as a session (in memory) attribute, but I need to get it into a database so that I can create an API that will provide that info to authorized consumers.
I'd rather not overhaul the system to store all session data in the database (unless it's trivial and can handle namespaces), but I can't find any information on a recommended way for the myUser class to write data to the database. Is it possible to do this since the class doesn't have a model, per se (that I'm aware of)? Are there any recommended solutions or best practices for doing this?
Thanks.
UPDATE
If I were to boil this all the way down to its bare essentials, I guess the key question is this: What's the "best" way to read from/write to a database from the myUser class? Or, alternatively, is there another path that's recommended to accomplish the same end result?
Would storing the result of json_encode() or serialize() applied to
$myUserInstance->getAttributeHolder()->getAll()
do the job?
In the absence of a good means of accessing a database from the myUser class, I opted for a slightly different path. I installed memcached in a location accessible by both apps and extended PHP's Memcached class to apply a few customizations. The apps can now share information by writing specially formatted keys to the memcached instance.
I opted not to overhaul my existing cache storage mechanism (why upset the apple cart?) and am reading from/writing to memcached selectively for information that truly needs to be shared.
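The "specially formatted keys" convention can be sketched like this (Python for illustration; a plain dict stands in for the memcached client, and the `app:namespace:key` scheme is just one hypothetical convention):

```python
class SharedCache:
    """Namespaced wrapper so multiple apps can share one cache safely."""
    def __init__(self, client, prefix):
        self.client = client  # real code: a Memcached client instance
        self.prefix = prefix

    def _key(self, namespace, key):
        # All shared entries follow the same "prefix:namespace:key" format,
        # so both apps compute identical keys for the same logical item.
        return f"{self.prefix}:{namespace}:{key}"

    def set(self, namespace, key, value):
        self.client[self._key(namespace, key)] = value

    def get(self, namespace, key, default=None):
        return self.client.get(self._key(namespace, key), default)

backing = {}  # stand-in for the shared memcached instance
app_a = SharedCache(backing, "shared")
app_b = SharedCache(backing, "shared")
app_a.set("user", "123:email", "user@example.com")
```

Because both apps derive keys the same way, whatever one writes the other can read, without touching either app's private cache entries.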