I have a Rails inventory app that is available to global users, allowing them to enter their own inventory information and query that of others. For example:
- A British person in London adds 10 units of "bicycle" to the inventory table.
- A Japanese person adds 2 units of 自転車 (bicycle in Japanese).
- A Vietnamese person adds 5 units of xe đạp (bicycle in Vietnamese).
The British person can then query 'bicycle' and see all bicycles in the system (17 units), with the details of each shown in its original language, without the users having classified them beforehand. Likewise, the Japanese person can query '自転車' and see all the bicycles.
How can this be done?
The globalize gem requires users to manually translate each record, so it's not the right approach. I've heard about machine learning and deep learning, but I don't know whether they're the right solution for this.
If Stack Overflow is not the right place to ask this, where should I ask? Quora does not allow long questions.
Machine learning does not seem like a good fit here, since you don't have experience with it, and it's a complex field to pick up and learn well enough to apply to a real-life problem.
Here are a few solutions you could implement today. Each has its own requirements and trade-offs, which you will have to weigh for yourself.
Since I don't have enough information about your system, I'll generalize it to something that's likely.
Solutions:
1. Define a limited number of items for your system (such as "Bicycle") and add them to a config file or an items table, each with its own unique ID. When users add something, they have to select from your list. Include an "Other" item as a catch-all, and maybe provide a note field so users can describe anything that isn't on the list.
2. Similar to the above, but give users a way to add new items to the system: you start with, say, 10 standard items, users can submit new ones (subject to moderation), and other users will then have access to them.
3. Put a solid search system in place, such as Elasticsearch (or anything else). When a user creates an item, index it in the language it was entered in, then use the Google Translate API (or another translation service) to translate it into every language you need and index those translations for search as well (see the sketch below).
I think solution 1 is the best if you are able to implement it, followed by solution 2.
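For solution 3, here is a minimal sketch of what the indexing flow could look like in Ruby with the elasticsearch gem. `translate_to` is a hypothetical placeholder for whatever translation service you pick, and the index name, field names, and item attributes are assumptions, not anything from your schema:

```ruby
# Gemfile: gem "elasticsearch"
require "elasticsearch"

SUPPORTED_LOCALES = %w[en ja vi] # assumption: the languages you want searchable

# Hypothetical helper: wrap whatever translation service you choose here
# (Google Translate API, DeepL, etc.).
def translate_to(text, locale)
  raise NotImplementedError, "plug in your translation service"
end

ES = Elasticsearch::Client.new(url: ENV.fetch("ELASTICSEARCH_URL", "http://localhost:9200"))

# Index an item in its original language plus machine translations, so a
# search for "bicycle", "自転車" or "xe đạp" all hit the same document.
def index_item(item)
  translations = SUPPORTED_LOCALES.each_with_object({}) do |locale, acc|
    acc["name_#{locale}"] =
      locale == item.locale ? item.name : translate_to(item.name, locale)
  end

  ES.index(
    index: "items",
    id:    item.id,
    body:  translations.merge(quantity: item.quantity, locale: item.locale)
  )
end

# Search across all language fields at once.
def search_items(term)
  ES.search(
    index: "items",
    body: {
      query: {
        multi_match: { query: term, fields: SUPPORTED_LOCALES.map { |l| "name_#{l}" } }
      }
    }
  )
end
```

Summing the quantity field across the hits would then give the 17 units from your example.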
Related
We are developing a social app with Firebase (Swift / iOS).
We face the problem that we have two data trees and have to calculate the delta between them without generating high data traffic.
Example:
We have a cars structure and a user structure.
The cars structure contains 100 different vehicle models.
The user structure contains all vehicle models that have already been driven by the user.
We now want to implement a high-performance way to determine all the vehicles that a user has not yet driven, without downloading the whole tree structure.
The number of users and the number of vehicles are growing steadily.
Does anyone have a solution approach, or an idea of which direction we need to think in?
love, alex
I think the key to using Firebase effectively is data duplication. So if you want to display a list of cars the user has and hasn't driven, create a separate table containing only the information displayed in that list, such as the path to an image and the make & model, using unique IDs as the keys to the entries in that table. You wouldn't need to know things like top speed and price until they tap into the details, right? (I'm making some assumptions here, though.)
Then simply get the list of unique IDs for the cars the user has already driven, and manipulate your offline model accordingly.
Right now I'm using an external server to manage data duplication; it propagates a write operation to other places in the database when necessary. I'm on my phone right now, but I think Ray Wenderlich has an article about this.
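To make the duplication idea concrete, here is a rough server-side sketch in Ruby against the Firebase Realtime Database REST API. The node names (carSummaries, drivenCars), the database URL, and the absence of an auth token are all assumptions for illustration, not your actual schema:

```ruby
require "net/http"
require "json"

# Assumed duplicated layout (slim, list-only data):
#   /carSummaries/<carId>           => { "make" => ..., "model" => ..., "imagePath" => ... }
#   /users/<uid>/drivenCars/<carId> => true
FIREBASE_URL = "https://your-project.firebaseio.com" # placeholder; real DBs also need an auth token

def fetch_node(path)
  body = Net::HTTP.get(URI("#{FIREBASE_URL}/#{path}.json"))
  body == "null" ? {} : JSON.parse(body)
end

# Cars the user has NOT driven: download only the slim summary node and the
# user's driven-ID set, then take the difference locally.
def undriven_cars(uid)
  summaries = fetch_node("carSummaries")                 # ~100 small entries
  driven    = fetch_node("users/#{uid}/drivenCars").keys # just the IDs
  summaries.reject { |car_id, _| driven.include?(car_id) }
end
```

With only ~100 slim entries, that download stays small even as the full /cars records and the number of users grow.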
We already have an e-commerce app targeting 3 different countries with 3 different domains. It also uses 3 different databases.
Now we are building an iOS app, so my questions are:
Can we upload a bundle for a specific country only, so it's available in that country only (assuming multiple bundles are allowed for a single app)?
Or should we handle the JSON-based database requests in a single bundle by checking the user's location (i.e. a single bundle that handles everything in code)?
Our goal is that the app will only allow users from a specific country to place an order.
Also, prices are different for different countries; prices come from the server.
We don't have in-app purchase prices.
Please let me know which option is best; suggestions for other approaches are welcome too.
It's a broad question with lots of good answers, and unfortunately all of them are opinion-based, but I'll give my two cents.
You can absolutely create multiple apps and target each at a specific country. You control this by changing the availability settings, which allow the app to be shown only in certain countries.
The advantage of this method is that you get complete control and customizability specific to each country.
The disadvantage is that you are now maintaining multiple code bases. If you have a bug in one app, you need to update 3 apps to fix the "same" bug. What if you want to support more countries? Now you have to create that many clones of the app. Think about what happens if you have to add a new feature. It snowballs pretty quickly, doesn't it?
If you make one app, then there is only one code base: one place to make all code changes or add features. It's somewhat easy to maintain, code-wise.
The bad side: now you have to take care of every possible country-specific scenario, either within the app (e.g. localizations, currencies, etc.) or by getting that information from your servers.
There are APIs that can tell you which country a user is connecting from without asking the user.
In my opinion, creating one app is the way to go; it will save you lots of headaches down the road. Having said that, I don't know how Uber or other big international players handle their country-specific apps, whether they have one app or many.
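The question doesn't say what the backend is written in, so purely as an illustration, here's a Ruby/Rails-flavoured sketch of the "one app, country-aware server" idea. The CF-IPCountry header is Cloudflare-specific and just one example of a geo lookup you could swap for your own, and the price table and SKUs are made up:

```ruby
# Sketch: one backend serving country-specific prices and blocking unsupported countries.
class PricesController < ApplicationController
  SUPPORTED_COUNTRIES = %w[IN US GB].freeze          # assumption: your 3 markets
  PRICE_TABLES = {
    "IN" => { currency: "INR", "sku-1" => 499 },
    "US" => { currency: "USD", "sku-1" => 9   },
    "GB" => { currency: "GBP", "sku-1" => 7   },
  }.freeze

  def index
    # Example geo source: Cloudflare's CF-IPCountry header (only present if you
    # sit behind Cloudflare); replace with whatever GeoIP lookup you already use.
    country = request.headers["CF-IPCountry"].to_s.upcase

    if SUPPORTED_COUNTRIES.include?(country)
      render json: PRICE_TABLES[country]
    else
      # Country not served: the app can use this response to block ordering.
      render json: { error: "not_available_in_your_country" }, status: :forbidden
    end
  end
end
```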
What would be the better approach to let a user search for other users of the app (using Parse.com as the backend):
1. Import all the data from the _User table, then filter it in the app when using the UISearchBar
2. Query Parse for the search term and load the results into the table view
There is no "right" answer. It depends entirely on how you define "better."
Option 1 likely produces the best user experience on the surface, in the sense that filtering a list on the fly looks a lot more responsive. But you have to schedule downloading the user list for when the user isn't already trying to search.
Option 2 is likely more efficient: less bandwidth, less storage. But the user has to be online to search, and you probably can't do a "real-time" filter unless you're on a fast network.
There may be other factors as well. I didn't want to expose a full list of users, for example, so I went with option 2.
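For option 2, the query itself is small. Here's a sketch against Parse's hosted REST API as it existed at the time (in an iOS app the same thing is a PFQuery with a regex/prefix constraint). The username field, the case-insensitive prefix match, and the limit are assumptions:

```ruby
require "net/http"
require "json"

# Option 2: ask Parse for only the users matching the search term.
def search_users(term)
  where = { "username" => { "$regex" => "^#{Regexp.escape(term)}", "$options" => "i" } }
  uri = URI("https://api.parse.com/1/classes/_User")
  uri.query = URI.encode_www_form(where: where.to_json, limit: 20, keys: "username")

  request = Net::HTTP::Get.new(uri)
  request["X-Parse-Application-Id"] = ENV["PARSE_APP_ID"]
  request["X-Parse-REST-API-Key"]   = ENV["PARSE_REST_KEY"]

  response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
  JSON.parse(response.body)["results"]
end
```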
Basically, I'm looking for a search engine that searches through a given database. The content being searched will be text.
You will probably want to use a service such as Solr. The easiest way to get started using it is to find a 'cloud' based version, such as Websolr. However, the solution will depend on what language you wish to use when programming your site.
Solutions depend somewhat on language:
1. For Java/C#, you have Lucene/Solr.
2. For Python, you have Haystack.
You could also do text search directly in the database via LIKE/ILIKE, but performance depends on the database.
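If you go the LIKE/ILIKE route from a Rails app, it's a one-liner. ILIKE is PostgreSQL-specific (plain LIKE is already case-insensitive under MySQL's default collations), and Article/body are made-up names for illustration:

```ruby
# Plain substring search straight in the database. Note that a leading %
# defeats a normal btree index, so this gets slow as the table grows.
def search_articles(term)
  Article.where("body ILIKE ?", "%#{term}%") # escape % and _ if users may type them
end
```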
Iconfinder was coded specifically for icon search, and at the time it launched (2007) there were no scripts that worked well for this.
Building a search engine like Iconfinder is not rocket science. I think the hardest part is getting the SQL tuned and figuring out how to rank the content. At the moment I collect data about impressions and downloads and calculate a value from that. An icon's rank is based on this value (downloads/impressions) and on how well the keywords match the tags for the given icon.
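That ranking value is just arithmetic; roughly something like the following, where the 70/30 weighting is an arbitrary example and not Iconfinder's actual formula:

```ruby
# Rank = popularity (downloads per impression) combined with how well the
# query matches the icon's tags. The weights are illustrative only.
def icon_score(icon, query_terms)
  return 0.0 if query_terms.empty?
  popularity = icon.impressions.zero? ? 0.0 : icon.downloads.to_f / icon.impressions
  tag_match  = (query_terms & icon.tags).size.to_f / query_terms.size
  0.7 * popularity + 0.3 * tag_match
end
```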
My Rails app deals a lot with data from third-party APIs (specifically UPS, FedEx, DHL, etc).
What I'd like to do is whenever that data comes in, replace certain phrases with customized phrases.
Example: I'd like to replace "On FedEx vehicle for delivery" (which we get from the FedEx API) with "Out for Delivery."
Is it best to replace the text on its way into the database, or on output? (Speaking from an end-user speed perspective.)
I'm planning on storing these phrases in our database, so I'm assuming I'd just create a helper that pulls the phrases I want to replace and then runs the strings through gsub, replacing as necessary?
Any tips on making this efficient and easy to manage would be great.
For speed you should replace the phrases when they enter the database. If you do it on output, you'll have to do it every time a user requests the data, and doing it every time will obviously put more load on the server.
You may, however, want to store the original phrases as well, in case you later want to change the wording of the replacements.
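A minimal sketch of that approach, assuming a phrase_replacements table with original/replacement columns and a raw_status column to preserve the vendor's wording (all of these names are made up):

```ruby
# Maps vendor wording to your wording, e.g.
#   original: "On FedEx vehicle for delivery" => replacement: "Out for Delivery"
class PhraseReplacement < ActiveRecord::Base
end

class Shipment < ActiveRecord::Base
  before_save :normalize_status

  private

  def normalize_status
    self.raw_status = status                   # keep the vendor's original text
    self.status = PhraseReplacement.all.reduce(status) do |text, phrase|
      text.gsub(phrase.original, phrase.replacement)
    end
  end
end
```

Since the phrase table is small, caching it in memory keeps the callback from hitting the database on every save.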
Just a random idea, which might not be applicable depending on what your data looks like, but maybe you could leverage the i18n framework that's built into Rails for this. The original text could be viewed as a separate language called vendorspeak :-).
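A rough sketch of how that could look; the key normalization is just one option, the helper name is made up, and you'd need to add :vendorspeak to config.i18n.available_locales so the lookup doesn't raise:

```ruby
# config/locales/vendorspeak.yml
#   vendorspeak:
#     statuses:
#       on_fedex_vehicle_for_delivery: "Out for Delivery"

def friendly_status(raw_status)
  # "On FedEx vehicle for delivery" -> "on_fedex_vehicle_for_delivery"
  key = raw_status.parameterize(separator: "_")
  I18n.t(key, scope: :statuses, locale: :vendorspeak, default: raw_status)
end
```

The default: raw_status fallback means unrecognized vendor phrases pass through unchanged instead of raising a missing-translation error.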