Searching for a particular trading platform - currency

I'm trying to find an exchange/trading platform that meets the following conditions:
I must be able to purchase currencies (fiat or crypto) without needing to convert back to a base currency. I do not wish to open and close positions, and I plan to only take, never make. Is there a name for this?
The commission needs to be as low as possible, preferably less than 0.1% per trade (I want to trade at high frequency).
I'm after API access for the high-frequency trading.
I'm after a large grid/table (every currency priced in every other currency), ideally with 9+ currencies. I have made this in Excel using modified data from https://www.exchangerates.org.uk/currency/currency-exchange-rates-table.html
If all of these exist on a single website, that would be wonderful!
Thank you in advance for any and all help.

Binance.com (crypto) has what you want.
API Access: https://github.com/binance-exchange/binance-official-api-docs
Low Fees: 0.075% for a taker trade
If you use my referral link you will get an additional permanent 10% trading fee reduction.
https://www.binance.com/en/register?ref=OTWO00WS
(I also get 10%, so it's a win-win.)
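As a sketch of how the cross-rate grid requirement could be met with the API linked above: Binance's public GET /api/v3/ticker/price endpoint returns a price per symbol, and a grid of every asset against every other can be derived through a common quote currency. The fetch is shown as a comment so the grid logic stays self-contained; the asset symbols below are illustrative.

```python
# Sketch: build an "every currency into every other currency" grid from
# per-symbol prices, as returned by Binance's public GET /api/v3/ticker/price
# endpoint (documented in the API repo linked above). The live fetch is left
# as a comment so the grid-building logic is runnable on its own.
#
# import requests
# prices = {t["symbol"]: float(t["price"])
#           for t in requests.get("https://api.binance.com/api/v3/ticker/price").json()}

def cross_rate_grid(prices, assets, quote="BTC"):
    """Price every asset in every other asset via a common quote currency."""
    # Price of each asset in the quote currency (the quote itself is 1.0).
    in_quote = {a: (1.0 if a == quote else prices[a + quote]) for a in assets}
    return {base: {other: in_quote[base] / in_quote[other]
                   for other in assets if other != base}
            for base in assets}

# Example with made-up BTC-quoted prices:
sample = {"ETHBTC": 0.05, "LTCBTC": 0.005}
grid = cross_rate_grid(sample, ["ETH", "LTC", "BTC"])  # grid["ETH"]["LTC"] == 10.0
```

Note that a cross rate computed through one quote currency ignores the per-pair order books and fees, so it is a display grid, not an executable price.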

Can a group of 3 researchers share/pool Twitter API tokens to accelerate/improve data collection on a sentiment analysis project?

Our group is working on a sentiment analysis research project. We are trying to use the Twitter API to collect tweets. Our target dataset involves a lot of query terms and filters. Since each of us has a developer account, we were wondering if we could pool API access tokens to accelerate the data collection. For example, we would make an app that reads a configuration file containing a list of our access tokens, and the app would try each token when searching for tweets. This app would run on our local computers. Since the app uses our individual access tokens, we believe we are not actually bypassing or changing any Twitter limit, as the record is kept per access token. Are there any legal or technical problems that may arise from this methodology? Thank you! =D
Here is pseudocode for what we are trying to do:
1. Define a list of search terms such as 'apple', 'banana', and 'oranges' (we have 100 of these search terms, and we are okay with the limit of 100 tweets per request).
2. Define a list of frequent emotional adjectives such as 'happy', 'sad', 'crazy', etc. (we have 100 of these), selected using TF-IDF.
3. Take the product of the search terms and emotional adjectives. In total we have 10,000 query terms, and from the rate-limit rules we have computed that we would need at least 55 runs of 15-minute sessions at 180 requests per 15-minute window; 55 * 15 = 825 minutes, or ~14 hours, to collect this amount of tweets.
4. We were thinking of improving the data collection by pooling access tokens so that we can trim the collection time from ~14 hours to ~4 hours, e.g. by dividing the query terms into subsets and letting a specific access token work on each subset.
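The rate-limit arithmetic in the steps above can be sketched as follows, assuming the standard search limit of 180 requests per 15-minute window per token (rounding the window count up, which is why 10,000 queries comes out to exactly 14 hours here rather than the ~14 in the estimate above):

```python
# Sketch of the rate-limit arithmetic: how long it takes to issue
# num_queries search requests at per_window requests per window_min minutes,
# optionally spread across several tokens.
import math

def collection_time_hours(num_queries, tokens=1, per_window=180, window_min=15):
    windows = math.ceil(num_queries / (per_window * tokens))
    return windows * window_min / 60

# 100 search terms x 100 adjectives = 10,000 queries
one_token = collection_time_hours(10_000)               # 14.0 hours with 1 token
three_tokens = collection_time_hours(10_000, tokens=3)  # 4.75 hours with 3 tokens
```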
We are pushing for this because we think it would simply be more efficient, if it is possible and permitted, and it might help future research as well.
The question is: are we actually breaking any Twitter rules or policies by doing this? By each sharing one access token and creating apps named as clones of the research project, we believe we are also giving something up, namely the headroom for one more app that each of us fully controls.
I can't find a specific Twitter rule about this so far. Our concern is that we will publish a paper and will also publish, for documentation, the app we plan to build and use. Disclaimer: only the app's source code will be published, not the dataset, because of Twitter's explicit rules about datasets.
This is absolutely not allowed under the Twitter Developer Policy and Agreement.
Twitter developer policy 5a:
Do not do any of the following:
Use a single application API key for multiple use cases or multiple application API keys for the same use case.
Feel free to check with Twitter directly via the developer forums. Stack Overflow is not really the best place for this question, since it is not specifically a coding question.

Adwords API Cost Per Click from a Gclid

I would really like to be able to pull the cost of a click out of AdWords given a gclid. I can upload this to AdWords, but we have rich metadata about the customer after they have completed our online application. I would like to do some analysis on this data, but I need to find out the cost of acquiring the customer.
Unfortunately, Google does not release the cost information at that granular level. They do provide a click performance report (details here), but performance metrics such as cost, conversions, etc. are all infuriatingly absent.
You can, of course, match the gclId to a unique criteriaId, device, date, etc., then get the cost for that combination, but, at least in general, you'll lose a little information in the aggregation unless there happens to be just a single click in that segment.
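The matching approach above can be sketched as follows: map each gclid to its (criteriaId, device, date) segment from the click performance report, then spread the segment's cost evenly over its clicks. The field names here are illustrative, not the exact report column names, and the even split is an assumption that is exact only when a segment contains a single click.

```python
# Sketch: estimate a per-click cost by joining click-level gclid rows to
# segment-level cost totals. Field names are illustrative placeholders.
from collections import Counter

def estimated_cost_per_gclid(clicks, segment_costs):
    """clicks: list of dicts with 'gclid', 'criteriaId', 'device', 'date'.
    segment_costs: {(criteriaId, device, date): total segment cost}."""
    seg_of = {c["gclid"]: (c["criteriaId"], c["device"], c["date"]) for c in clicks}
    clicks_per_seg = Counter(seg_of.values())
    # Even split over the segment: exact only for single-click segments.
    return {g: segment_costs[s] / clicks_per_seg[s] for g, s in seg_of.items()}

clicks = [
    {"gclid": "a1", "criteriaId": 1, "device": "mobile", "date": "2020-01-01"},
    {"gclid": "b2", "criteriaId": 1, "device": "mobile", "date": "2020-01-01"},
]
costs = {(1, "mobile", "2020-01-01"): 3.00}
est = estimated_cost_per_gclid(clicks, costs)  # each click estimated at 1.50
```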

Black Scholes options pricing

I am investigating how involved creating a very simple options trading platform would be (not for profit, but for learning purposes). Can someone please explain the process flow of how Black-Scholes option pricing is used within trading platforms? Below is my understanding; please correct me if I am mistaken:
1) In-memory prices of options derived from the Black-Scholes formula.
2) An incoming buy order for an option arrives in FIX protocol format.
3) The trading platform compares the price of the buy order with the price derived from Black-Scholes and decides whether to trade accordingly.
Please correct me if I am mistaken anywhere. Thanks in advance.
1) In-memory prices of options derived from the Black-Scholes formula.
That is the job of the user application; QuickFIX will not help in any matter regarding this.
2) An incoming buy order for an option arrives in FIX protocol format.
You take the message, parse it, and store the required information for yourself.
3) The trading platform compares the price of the buy order with the price derived from Black-Scholes and decides whether to trade accordingly.
This again is the job of the user application. The information collected in step 2 will help you here.
FIX is a message format for communication, which should be kept separate from your pricing logic. Otherwise it will slow down the messaging unnecessarily for no apparent gain.
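As a minimal sketch of step 1, here is the Black-Scholes price for a European call in plain Python, kept entirely separate from any FIX messaging, in line with the advice above:

```python
# Black-Scholes price of a European call option.
# C = S*N(d1) - K*exp(-rT)*N(d2), with the standard d1/d2 terms.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """S: spot, K: strike, T: years to expiry, r: risk-free rate, sigma: vol."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

price = bs_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)  # ~10.45
```

A real platform would precompute and cache these prices (the "in memory" part of step 1) and refresh them as the underlying spot and volatility inputs change.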

Bloomberg real-time data with lot sizes

I am trying to download real-time trading data from Bloomberg using the api.
So far I can get bid/ask/last prices successfully, but on some exchanges (like Canada) quote sizes are in lots.
Of course, I can query the lot sizes with the reference data API and store them for every security in a database or something like that, but converting the size for every quote tick is a very "expensive" conversion, since ticks arrive every second or even more often.
So is there any other way to achieve this?
Why do you need to multiply each value by the lot size? As long as the lot size is constant, the quotes are comparable and any computation can be implemented using the exchange values. The results can be scaled in the presentation layer if necessary.

Reverse geocoding services

I'm working on a project that returns information based on the user's location. I also want to display the user's town in text (no map) so they can change it if it's not accurate.
If things go well, I hope this will be more than a small experiment, so can anyone recommend a good reverse geocoding service with the fewest restrictions? I notice that Google and Yahoo limit the number of daily queries, along with other usage terms. I basically need to take a latitude and longitude and convert them to a city/town (which I presume cannot be done using the HTML5 Geolocation API).
Geocoda just launched a geocoding and spatial database service and offers up to 1K queries a month free, with paid plans starting at $49/month for 25,000 queries. SimpleGeo just closed their Context API, so you may want to look at Geocoda or other alternatives.
You're correct, the browser geolocation API only provides coordinates.
I use SimpleGeo a lot and recommend them. They offer 10K queries a day free, then US$0.25 per 1K calls after that. Their Context API is what you're going to want; it pretty much does what it says on the tin. Works server-side and client-side (without requiring you to draw a map, like Google does).
GeoNames can also do this and allows up to 30K "credits" a day; different queries expend different credit amounts. The free service has highly variable performance, while the paid service is more consistent. I've used them in the past, but don't much anymore, because of the difficulty of automatically dealing with their data, which is more "pure" but less meaningful to most people.
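As a sketch of the lat/lng-to-town step with GeoNames, the findNearbyPlaceNameJSON endpoint returns nearby populated places. A free registered username is required; "demo" below is a placeholder, and the response parsing is factored out so it can be exercised without network access.

```python
# Sketch: reverse geocode coordinates to a town name via GeoNames'
# findNearbyPlaceNameJSON endpoint. Requires a registered GeoNames username
# ("demo" is a placeholder here).
import json
from urllib.request import urlopen
from urllib.parse import urlencode

def town_from_response(data):
    """Pick the nearest place name out of a GeoNames JSON response."""
    places = data.get("geonames", [])
    return places[0]["name"] if places else None

def reverse_geocode(lat, lng, username="demo"):
    qs = urlencode({"lat": lat, "lng": lng, "username": username})
    with urlopen("http://api.geonames.org/findNearbyPlaceNameJSON?" + qs) as r:
        return town_from_response(json.load(r))

# Parsing example with a canned response:
sample = {"geonames": [{"name": "Cambridge", "countryCode": "GB"}]}
town = town_from_response(sample)  # "Cambridge"
```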