How do I get / retrieve my REST data using ember-data? - ruby-on-rails

I've been scouring for documentation on the REST adapter that is packaged with ember-data, but I can't seem to find any information on how to actually have Ember make the JSON request to the server, or how to retrieve or access the data once the request has been made using this adapter. (The documentation on the ember-data page seems to be all about rolling your own adapter, aside from a small paragraph on how to disable bulk commits, though maybe I'm just missing something.)

You have to tell your store to use the DS.RESTAdapter; it handles the communication with your server via AJAX calls. See a basic example here.
You can also get a basic overview of how the RESTAdapter is used in its tests.
App = Ember.Application.create({});

App.store = DS.Store.create({
  revision: 3,
  adapter: DS.RESTAdapter.create({
    ajax: function(url, type, hash) {
      console.log(arguments);
    }
  })
});

App.Person = DS.Model.extend({});

App.Person.createRecord({});

// tell the store to contact the REST service
App.store.commit();
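To actually retrieve records, the store exposes finder methods that go through the same adapter. A rough sketch for this pre-1.0 revision of ember-data (the exact API changed frequently between revisions, and the request URL depends on the adapter's pluralization rules):

// triggers something like GET /persons/1 via the RESTAdapter
var person = App.store.find(App.Person, 1);

// triggers something like GET /persons and returns a live record array
var people = App.store.findAll(App.Person);

// records are populated asynchronously once the JSON response arrives,
// so bind to their properties (e.g. in templates) rather than reading them immediately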

Related

POST data to Next.js page from external application

I have a Java application which sends JSON data to an API via POST. What I'm trying to do is collect this data in the Next.js application to display it and store it in a database later. I can't figure out how to fetch this data from the Next app. Currently I have the following code in pages/api/comment, and I'm calling http://localhost:3000/api/comment from the Java application:
// in-memory list of comments; this is lost between serverless invocations
const comments = [];

export default function handler(req, res) {
  if (req.method === 'POST') {
    const comment = req.body.data;
    const newCom = {
      id: Date.now(),
      text: comment,
    };
    comments.push(newCom);
    res.status(201).json(newCom);
  }
}
Can someone give me some directions please? Thank you very much in advance.
Since Next.js works in a serverless architecture, you need to persist/save the data somewhere, such as a database, at the time it is posted. Only then can you retrieve the data via a GET API.
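A rough sketch of what pages/api/comment.js could look like once the data is persisted (the db module and its saveComment/getComments helpers are hypothetical stand-ins for whatever database client you choose):

import db from '../../lib/db'; // hypothetical persistence module

export default async function handler(req, res) {
  if (req.method === 'POST') {
    // persist instead of pushing to an in-memory array,
    // which is lost between serverless invocations
    const newCom = { id: Date.now(), text: req.body.data };
    await db.saveComment(newCom);
    return res.status(201).json(newCom);
  }

  if (req.method === 'GET') {
    // the Next.js page (or any client) can now retrieve what the Java app posted
    const comments = await db.getComments();
    return res.status(200).json(comments);
  }

  res.status(405).end();
}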

How do I display data from Rails backend to Ember frontend without saving it in the database and without using Ember-Data?

The problem:
I have used rest-client to pull news articles from newsapi.org through my Rails backend. I can display that data on localhost:3000/articles. I do not (necessarily) want to save this data to a database and would like to simply display it on my Ember frontend after calling it on my Rails backend. I get this error:
Error while processing route: article Assertion Failed: You must include an 'id' for article in an object passed to 'push'
I understand this to mean that my data has no 'id' - which it doesn't. I have no control over the data coming in.
{
  "status": "ok",
  "totalResults": 6868,
  "articles": [
    {
      "source": {
        "id": "mashable",
        "name": "Mashable"
      },
      "author": "Lacey Smith",
      "title": "Cannabis may alter the genetic makeup of sperm",
      "description": "A new study suggests that marijuana use can not only lower sperm count, but also affect sperm DNA. Read more... More about Marijuana, Cannabis, Sperm, Science, and Health",
      "url": "https://mashable.com/video/cannabis-sperm-dna/",
      "content": null
    },
Ember-Data doesn't like this because it's not in JSONAPI format and I can't change it to JSONAPI. How do I display this data on my ember frontend without using Ember-Data and without having to save it into my database? The tutorials I've found all have examples using databases and Ember-Data.
What I've already tried:
I tried to save it to my database and load from there, but I'm thinking that might complicate the idea, since I would need to scrape the data and save it to my database and that has proven difficult to do. I'm not experienced with backend stuff and haven't found many answers. If you have an answer for doing it this way, feel free to answer here:
How to automatically save data from url to database
I'm willing to try it out. I thought originally this would be the best way to go. I could save the data and only add to the database when new articles were returned. I'm not having any luck this way, so I thought I'd approach it from another direction.
Thanks for any help!
If you want to do some processing outside of ember-data, create a service. In your service, you will then need to make an ajax/fetch request to your backend. Let's say you are using ember-ajax to make your requests:
import Service, { inject } from '@ember/service';

export default Service.extend({
  // inject the ember-ajax ajax service into your service
  ajax: inject(),

  // this returns a promise
  getNewsItems() {
    var options = {
      // request options (aka anything needed to authenticate against your api, etc)
    };
    let url = "yourBackendEndpoint/for/this/resource";
    return this.get('ajax').request(url, options).then(response => {
      // `response` is the data from the server
      // here you want to convert the JSON into something usable
      return response;
    });
  }
});
Then, let's say in a model hook for some route, you needed this service (which we will say is defined in your-app/services/my-service.js) to fetch the news feed items:
import Route from '@ember/routing/route';
import { inject } from '@ember/service';

export default Route.extend({
  myService: inject(),

  model() {
    let myService = this.get('myService');
    return myService.getNewsItems();
  }
});
I personally use ember-fetch, which avoids the jQuery dependency and is usable in all target browsers for my app. I do not use ember-data in any of my apps.
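For reference, a minimal sketch of the same service written against ember-fetch (the endpoint is a placeholder):

import Service from '@ember/service';
import fetch from 'fetch'; // the fetch shim provided by ember-fetch

export default Service.extend({
  // returns a promise that resolves with the parsed JSON
  async getNewsItems() {
    let response = await fetch('yourBackendEndpoint/for/this/resource');
    if (!response.ok) {
      throw new Error(`Request failed with status ${response.status}`);
    }
    // convert the JSON into something usable before returning it
    return response.json();
  }
});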

How to retrieve Medium stories for a user from the API?

I'm trying to integrate Medium blogging into an app by showing some cards with posts images and links to the original Medium publication.
From the Medium API docs I can see how to retrieve publications and create posts, but they don't mention retrieving posts. Is retrieving posts/stories for a user currently possible using Medium's API?
The API is write-only and is not intended to retrieve posts (Medium staff told me).
You can simply use the RSS feed as such:
https://medium.com/feed/@your_profile
You can simply get the RSS feed via GET; then, if you need it in JSON format, just use an npm module like rss-to-json and you're good to go.
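As a rough sketch of that approach (this assumes a recent rss-to-json release that exposes an async parse() function; older versions use a load() callback instead, so check the version you install):

const { parse } = require('rss-to-json'); // assumption: recent rss-to-json API

async function getMediumPosts(handle) {
  // fetch the RSS feed and convert it into a plain JS object
  const feed = await parse(`https://medium.com/feed/@${handle}`);
  return feed.items; // title, link, publication date, etc.
}

getMediumPosts('your_profile').then((items) => console.log(items));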
Edit:
It is possible to make a request to the following URL and you will get the response. Unfortunately, the response is in RSS format which would require some parsing to JSON if needed.
https://medium.com/feed/@yourhandle
⚠️ The following approach is not applicable anymore as it is behind Cloudflare's DDoS protection.
If you're planning to get it from the client side using JavaScript, jQuery, Angular, etc., then you need to build an API gateway or web service that serves your feed. In the case of PHP, RoR, or any other server-side stack, that should not be necessary.
You can get it directly in JSON format as given beneath:
https://medium.com/@yourhandle/latest?format=json
In my case, I made a simple web service in an Express app and hosted it on Heroku. The React app hits the API exposed over Heroku and gets the data.
const MEDIUM_URL = "https://medium.com/@yourhandle/latest?format=json";

router.get("/posts", (req, res, next) => {
  request.get(MEDIUM_URL, (err, apiRes, body) => {
    if (!err && apiRes.statusCode === 200) {
      // the JSON response is prefixed with an anti-hijacking string,
      // so strip everything before the first "{"
      let i = body.indexOf("{");
      const data = body.substr(i);
      res.send(data);
    } else {
      res.status(500).json(err);
    }
  });
});
Nowadays this URL:
https://medium.com/@username/latest?format=json
sits behind Cloudflare's DDoS protection service, so instead of consistently being served your feed in JSON format, you will usually receive HTML that renders a reCAPTCHA page, leaving you with no data from the API request.
And the following:
https://medium.com/feed/@username
has a limit of the latest 10 posts.
I'd suggest this free Cloudflare Worker that I made for this purpose. It works as a facade, so you don't have to worry about how the posts are obtained from the source, about reCAPTCHAs, or about pagination.
Full article about it.
Live example. To fetch the following items, add the query param ?next= with the value of the JSON field next which the API provides.
const MdFetch = async (name) => {
  const res = await fetch(
    `https://api.rss2json.com/v1/api.json?rss_url=https://medium.com/feed/${name}`
  );
  return await res.json();
};

const data = await MdFetch('@chawki726');
To get your posts as JSON objects, replace @USERNAME with your own username:
https://api.rss2json.com/v1/api.json?rss_url=https://medium.com/feed/@USERNAME
With that REST method you would do this: GET https://api.medium.com/v1/users/{{userId}}/publications, and this would return the title, image, and the item's URL.
Further details: https://github.com/Medium/medium-api-docs#32-publications.
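For illustration, a rough sketch of calling that endpoint from Node (the integration token and user id are placeholders; remember this lists publications, not posts):

// Sketch: list a user's publications via the official API.
// Requires a self-issued integration token from your Medium settings.
const token = 'YOUR_INTEGRATION_TOKEN'; // placeholder
const userId = 'YOUR_USER_ID';          // placeholder

fetch(`https://api.medium.com/v1/users/${userId}/publications`, {
  headers: { Authorization: `Bearer ${token}` }
})
  .then((res) => res.json())
  .then((json) => console.log(json.data)); // name, description, url, imageUrl per publication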
You can also add "?format=json" to the end of any URL on Medium and get useful data back.
Use this URL; it will give you the posts in JSON format. Replace studytact with your feed name:
https://api.rss2json.com/v1/api.json?rss_url=https://medium.com/feed/studytact
I have built a basic function using AWS Lambda and AWS API Gateway if anyone is interested. A detailed explanation is found in this blog post here, and the repository for the Lambda function built with Node.js is found here on GitHub. Hopefully someone here finds it useful.
(Updating the JS Fiddle and the Clay function that explains it as we updated the function syntax to be cleaner)
I wrapped the GitHub package @mark-fasel was mentioning below into a Clay microservice that enables you to do exactly this:
Simplified Return Format: https://www.clay.run/services/nicoslepicos/medium-get-user-posts-new/code
I put together a little fiddle, since a user was asking how to use the endpoint in HTML to get the titles for their last 3 posts:
https://jsfiddle.net/h405m3ma/3/
You can call the API as:
curl -i -H "Content-Type: application/json" -X POST -d '{"username":"nicolaerusan"}' https://clay.run/services/nicoslepicos/medium-get-users-posts-simple
You can also use it easily in your node code using the clay-client npm package and just write:
Clay.run('nicoslepicos/medium-get-user-posts-new', {"profile":"profileValue"})
  .then((result) => {
    // Do what you want with returned result
    console.log(result);
  })
  .catch((error) => {
    console.log(error);
  });
Hope that's helpful!
Check this one; you will get all the info about your own posts.
// `parser` here is assumed to be an RSS-parsing helper that calls back
// with an array of feed items
mediumController.getBlogs = (req, res) => {
  parser('https://medium.com/feed/@profileName', function (err, rss) {
    if (err) {
      console.log(err);
    }
    var stories = [];
    for (var i = rss.length - 1; i >= 0; i--) {
      var new_story = {};
      new_story.title = rss[i].title;
      new_story.description = rss[i].description;
      new_story.date = rss[i].date;
      new_story.link = rss[i].link;
      new_story.author = rss[i].author;
      new_story.comments = rss[i].comments;
      stories.push(new_story);
    }
    console.log('stories:');
    console.dir(stories);
    res.status(200).json({
      Data: stories
    });
  });
};
I have created a custom REST API to retrieve the stats of a given post on Medium. All you need is to send a GET request to my custom API, and you will retrieve the stats as a JSON object, as follows:
Request :
curl https://endpoint/api/stats?story_url=THE_URL_OF_THE_MEDIUM_STORY
Response:
{
  "claps": 78,
  "comments": 1
}
The API responds within a reasonable response time (< 2 sec); you can find more about it in the following Medium article.

Rails: sleep until there is data to respond with (streaming + multithreading)

I am building a Rails/Javascript application which is supposed to support real-time notifications. JavaScript is:
var chat;
$(document).ready(function(){
  chat = $('#chat');
  chat.append('mmm');
  (function poll(){
    $.ajax({
      url: "http://localhost:3000/get_messages",
      success: function(data){
        // Update your dashboard gauge
        chat.append(data);
      },
      dataType: "json",
      complete: poll,
      timeout: 30000
    });
  })();
});
The route:
match 'get_messages', to: 'real_time_notifs#get_messages', :via => :get
Here is the controller's method:
def get_messages
  # sleep ??? will it stop the whole application?
  render :json => ['message body']
end
I want the JavaScript to receive an answer only if there is something to display (for example, a new message appeared in a database table), without making the whole application stop. Could you suggest how to organize the get_messages method?
I need a solution which will not block the rest of the application while waiting.
There are a number of ways to achieve this.
Although I don't have huge experience, you should be thinking about it from another perspective (not just sending Ajax poll requests):
SSEs (Server-Sent Events)
I'd recommend you use SSEs.
The updates that are sent are not in the usual HTTP scope (they use their own MIME type -- text/event-stream), which I believe means they are completely asynchronous (it doesn't matter what you're doing in the app, etc.).
SSEs are basically handled on the front end by deploying a JS listener. This listens to the server for any updates but, unlike Ajax, only handles the text/event-stream MIME type:
var source = new EventSource("demo_sse.php");
source.onmessage = function(event) {
  alert(event.data);
};
The efficient part is that you can then update this with ActionController::Live::SSE in Rails. I don't have any experience with this, but it basically allows you to send updates via the text/event-stream MIME type.
WebSockets
WebSockets basically open a perpetual connection with your server, allowing you to receive content outside the normal HTTP scope.
My experience does not extend to "native" WebSockets (we've successfully used Pusher, and are working on our own WebSocket implementation), but I can say that it's basically a more in-depth version of SSEs.
You'll have to use JS to authenticate the client-server connection, and once connected, the browser will listen for updates. I'm not sure about the MIME type for this, but reading up on ActionController::Live will give you some insight into how it works.
Either one of these methods will do what you need (only send/receive updates as they become available).
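For the WebSocket route, the browser side would look roughly like this (a sketch only; the endpoint URL and message format depend entirely on the server-side implementation you choose, e.g. Pusher or your own socket server):

// open a perpetual connection instead of polling
var socket = new WebSocket('ws://localhost:3000/notifications'); // placeholder endpoint

socket.onopen = function () {
  console.log('connected, waiting for messages');
};

socket.onmessage = function (event) {
  var data = JSON.parse(event.data); // assumes the server pushes JSON
  $('#chat').append(data.body);
};

socket.onerror = function (err) {
  console.log('socket error', err);
};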

Is there a simple way to share session data stored in Redis between Rails and Node.js application?

I have a Rails 3.2 application that uses Redis as its session store. Now I'm about to write part of some new functionality in Node.js, and I want to be able to share session information between the two apps.
What I can do manually is read the _session_id cookie, and then read from a Redis key named rack:session:session_id, but this looks kind of like a hack-ish solution.
Is there a better way to share sessions between Node.js and Rails?
I have done this, but it does require making your own forks of things.
Firstly you need to make the session key the same name. That's the easiest job.
Next I created a fork of the redis-store gem and modified how the marshalling is done. I need to talk JSON on both sides, because finding a Ruby-style Marshal module for JavaScript is not easy. The file where I alter the marshalling is in that fork.
I also needed to replace the session middleware portion of Connect. The hash that is created is very specific and doesn't match the one Rails creates. I will need to leave this to you to work out, because there might be a nicer way. I could have forked Connect, but instead I extracted a copy of connect > middleware > session and required my own in.
You'll notice how the original adds in a base variable which isn't present in the Rails version. Plus you need to handle the case where Rails has created a session instead of Node; that is what the generateCookie function does.
/***** ORIGINAL *****/

// session hashing function
store.hash = function(req, base) {
  return crypto
    .createHmac('sha256', secret)
    .update(base + fingerprint(req))
    .digest('base64')
    .replace(/=*$/, '');
};

// generates the new session
store.generate = function(req){
  var base = utils.uid(24);
  var sessionID = base + '.' + store.hash(req, base);
  req.sessionID = sessionID;
  req.session = new Session(req);
  req.session.cookie = new Cookie(cookie);
};

/***** MODIFIED *****/

// session hashing function
store.hash = function(req, base) {
  return crypto
    .createHmac('sha1', secret)
    .update(base)
    .digest('base64')
    .replace(/=*$/, '');
};

// generates the new session
store.generate = function(req){
  var base = utils.uid(24);
  var sessionID = store.hash(req, base);
  req.sessionID = sessionID;
  req.session = new Session(req);
  req.session.cookie = new Cookie(cookie);
};

// generate a new cookie for a pre-existing session from rails without session.cookie
// it must not be a Cookie object (it breaks the merging of cookies)
store.generateCookie = function(sess){
  newBlankCookie = new Cookie(cookie);
  sess.cookie = newBlankCookie.toJSON();
};

//... at the end of the session.js file
// populate req.session
} else {
  if ('undefined' == typeof sess.cookie) store.generateCookie(sess);
  store.createSession(req, sess);
  next();
}
I hope this works for you. It took me quite a bit of digging around to make them talk the same.
I found an issue as well with flash messages being stored in JSON. Hopefully you don't find that one. Flash messages have a special object structure that JSON blows away when serializing. When the flash message is restored from the session, you might not have a proper flash object. I needed to patch for this too.
This may be completely unhelpful if you're not planning on using this, but all of my session experience with Node is through using Connect. You could use the Connect session middleware and change the session key name:
http://www.senchalabs.org/connect/session.html#session
and use this module to use redis as your session store:
https://github.com/visionmedia/connect-redis
I've never set up something like what you're describing, though; there may be some necessary hacking.
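For illustration only, a rough sketch of the Node side using the classic connect / connect-redis APIs (the cookie name, Redis key prefix, and option names are assumptions to verify against the versions you actually run; Rails' serialization still has to match, as described in the other answer):

var connect = require('connect');
var RedisStore = require('connect-redis')(connect);

var app = connect()
  .use(connect.cookieParser())
  .use(connect.session({
    key: '_session_id',              // same cookie name the Rails app uses
    secret: 'shared-signing-secret', // must line up with how the cookie is signed
    store: new RedisStore({
      prefix: 'rack:session:'        // so Node reads/writes the same Redis keys as Rails
    })
  }))
  .use(function (req, res) {
    // req.session is now backed by the same Redis entries Rails writes to
    res.end(JSON.stringify(req.session));
  });

app.listen(3001);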
