How to detect relay environment change if react-router-relay is used? - relayjs

Perhaps this is a very specific question, but I believe it would be good to find a solution. I see a lot of people asking about environment change after login/logout.
I'm using redux to store app state (login, modals, forms, etc.) and relay for data fetching. After login/logout I'm using a connected Router to trigger a re-render with the new environment:
import React from 'react';
import ReactDOM from 'react-dom';
import { connect, Provider } from 'react-redux';
import { Router, browserHistory, applyRouterMiddleware } from 'react-router';
import useRelay from 'react-router-relay';

// routes, environment and store are defined elsewhere in the app
const RelayApp = connect(_ => ({
  routes: routes,
  environment: environment,
  history: browserHistory,
  render: applyRouterMiddleware(useRelay)
}))(Router);

ReactDOM.render(
  <Provider store={store}>
    <RelayApp />
  </Provider>,
  document.getElementById('root')
);
I've got my routes defined like this:
export default (renderAppRoute) => (
  <Route render={renderAppRoute} path="/" component={App} queries={AppQueries}>
    <IndexRoute component={UserList} queries={UserListQueries} />
    <Route path=":username" component={User} queries={UserQueries} />
  </Route>
);
The renderAppRoute function defines what to render in the different Relay states, e.g. while fetching is still in progress:
const renderAppRoute = ({ done, props, element }) => {
  if (props) { // data is loaded
    return React.cloneElement(element, props);
  }
  return <h1>Loading...</h1>;
};
After I change the environment I'm getting the error (index):7 Uncaught TypeError: Cannot read property 'user' of undefined(…). On the initial fetch, props is undefined while fetching is in progress, and after fetching is done props contains the data for the RelayContainer fragments. But after switching to the new environment, props is never undefined, so my component is rendered even though there's no data in the new store yet. Is it possible to detect that the environment is new, or is there some other way to identify it, or do we need to build custom logic on the user side?
I'm currently handling it with a new variable, isNewEnvironment, which I set to true when I create the new environment and back to false once renderAppRoute is called with done set to true.
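For reference, this is roughly what that workaround looks like (the switchEnvironment helper and the module-level flag are illustrative, not part of the original code):
// illustrative sketch of the isNewEnvironment workaround
let isNewEnvironment = false;

function switchEnvironment() {
  // however the new Relay environment is built after login/logout
  environment = new Relay.Environment();
  isNewEnvironment = true;
}

const renderAppRoute = ({ done, props, element }) => {
  if (done) {
    // data for the new environment has arrived, so stop treating it as new
    isNewEnvironment = false;
  }
  if (props && !isNewEnvironment) {
    return React.cloneElement(element, props);
  }
  return <h1>Loading...</h1>;
};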

Related

Form Submit React and Rails

I am super new to React. Basically I have two components for now. My SearchInput component is a Select for cities that does a real-time search. It works really well (its render method is shown further down).
I added this component to my CityForm component, which is a search form that I need to submit. And here I am stuck. For now I have just checked whether submit works with a simple alert.
class CityForm extends React.Component {
  handleSubmit(event) {
    alert('Try tomorrow')
  }

  render() {
    return (
      <form onSubmit={this.handleSubmit}>
        <SearchInput placeholder="search city" style={{ width: 200 }} />
      </form>
    );
  }
}

ReactDOM.render(
  <CityForm />,
  document.getElementById('react-form')
);
ReactDOM.render(
<CityForm />,
document.getElementById('react-form')
);
I was thinking that the submit should be some kind of POST request. For now my URL doesn't change when I submit, and I need to send the POST request to some URL. The URL for the POST would be something like
filter?[city_id_eq]=02000000&commit=Search
Am I thinking about this correctly? I will also need to get the city_id from the Select, which will basically always have a value before submit. Should I store the value of the select in state so handleSubmit can read it, and how do I do this?
Maybe it's relevant: this is what I render in SearchInput:
render() {
  const options = this.state.data.map(d => <Option key={d.value}>{d.text}</Option>);
  return (
    <Select
      showSearch
      value={this.state.value}
      placeholder={this.props.placeholder}
      style={this.props.style}
      defaultActiveFirstOption={false}
      showArrow={false}
      filterOption={false}
      onSearch={this.handleSearch}
      onChange={this.handleChange}
      notFoundContent={null}
    >
      {options}
    </Select>
  );
}
First, create a new value state in your CityForm component and remove the value state from the SearchInput component. In other words, lift the state up. Afterwards, you can update CityForm's new value state from SearchInput via a callback function. Something along the lines of:
https://symfonycasts.com/screencast/reactjs/callback-props
Then you will have the city id available in the CityForm component's state.
Second, use e.g. Axios to make the form POST request.
https://www.digitalocean.com/community/tutorials/react-axios-react#step-3-%E2%80%94-making-a-post-request
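A rough sketch of how that could fit together (the onCityChange prop and the handler names are made up for illustration; the POST URL is the one from the question, and SearchInput would need to call onCityChange from its own onChange handler):
import React from 'react';
import axios from 'axios';
import SearchInput from './SearchInput'; // the component from the question

class CityForm extends React.Component {
  state = { cityId: null };

  // passed down to SearchInput and called whenever a city is selected
  handleCityChange = (cityId) => {
    this.setState({ cityId });
  };

  handleSubmit = (event) => {
    event.preventDefault();
    // POST the selected city id to the filter URL from the question
    axios.post(`/filter?[city_id_eq]=${this.state.cityId}&commit=Search`)
      .then((response) => console.log(response.data));
  };

  render() {
    return (
      <form onSubmit={this.handleSubmit}>
        <SearchInput
          placeholder="search city"
          style={{ width: 200 }}
          onCityChange={this.handleCityChange}
        />
      </form>
    );
  }
}
Here only the selected city id is lifted into CityForm; lifting the full value state, as described above, works the same way.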
P.S. Look into React hooks once you get a little more comfortable with React components. ;)

<Route>s match based on previous path when navigating using <Link>

When I navigate using a react-router <Link> to a page that has a <Switch> component on it with multiple <Route>s, the <Route> that ends up matching is one whose path I was previously on, rather than the current (new) path. It's as if the <Switch> renders before react-router is aware that the path has changed. This causes the wrong component to render. Am I doing something wrong?
react-router-dom version: 4.2.2
Example:
I have a Header component that looks like this (simplified):
import { Route, Switch } from 'react-router-dom'

// CheckoutHeader and NormalHeader are imported elsewhere
export default (props) => {
  return (
    <Switch>
      <Route
        path='/checkout'
        render={({ match, location }) => {
          console.log('path:', location.pathname)
          return <CheckoutHeader />
        }}
      />
      <Route
        path='/cart'
        render={({ match, location }) => {
          console.log('path:', location.pathname)
          return <NormalHeader />
        }}
      />
    </Switch>
  )
}
On my '/cart' page, I have this:
import { Link } from 'react-router-dom'

export default (props) => {
  return (
    <Link to='/checkout'>Go to checkout</Link>
  )
}
If I navigate using the URL bar to my /cart page, I see the Normal header and path: /cart is logged, which is correct.
But when I click the Link, the normal header remains, and path: /cart is logged again. I expect to see path: /checkout here, and the checkout header.
Refreshing the page fixes the problem, and replacing the React-Router link with a regular html link fixes it as well, but then of course we have to re-render the whole page and lose our redux store.

Swashbuckle's OAuth fails when response returned to o2c-html

My super simple sample Web API app successfully renders a swagger/ui document, but it fails when I try to configure it to make authorized OAuth 2.0 calls.
If I hit the icon to show the OAuth consent prompt and then hit the Authorize button, I'm successfully redirected to login.microsoftonline.com and am able to supply credentials. But when I'm redirected back to swagger/ui/o2c-html, it seems that the dynamically generated JavaScript in that page is buggy...
Generated 'o2c-html' page:
<script>
  var qp = null;
  if (window.location.hash) {
    qp = location.hash.substring(1);
  } else {
    qp = location.search.substring(1);
  }
  qp = qp ? JSON.parse('{"' + qp.replace(/&/g, '","').replace(/=/g, '":"') + '"}',
    function (key, value) {
      return key === "" ? value : decodeURIComponent(value);
    }
  ) : {};
  if (window.opener.swaggerUi.tokenUrl)
    window.opener.processOAuthCode(qp);
  else
    window.opener.onOAuthComplete(qp);
  window.close();
</script>
When I hover over the qp variable, I can see that I successfully retrieved the desired bearer token, but as execution proceeds down to the final conditional statements with window.opener, everything fails at window.opener.swaggerUi.
I get the following error:
JavaScript runtime error: Unable to get property 'swaggerUi' of
undefined or null reference occurred
I'm using these NuGet packages, FWIW:
<package id="Swashbuckle" version="5.5.3" targetFramework="net45" />
UPDATE - FIGURED OUT WHAT I DID WRONG
In my case, I did not have the right configuration in SwaggerConfig.cs, in the first EnableSwagger(c => ...) section.
I initially had it set like this (don't ask me why):
c.OAuth2("oauth2")
.Description("OAuth2 Implicit Grant")
.Flow("implicit")
.AuthorizationUrl(
string.Format("https://login.microsoftonline.com/{0}/oauth2/authorize",
ConfigurationManager.AppSettings["ida:Tenant"]))
.Scopes(scopes =>
{
scopes.Add("user_impersonation", "Access swagger");
});
But now it all works, after I changed it to an exhaustive listing of the actual roles that I use in my controllers:
c.OAuth2("oauth2")
.Description("OAuth2 Implicit Grant")
.Flow("implicit")
.AuthorizationUrl(
string.Format("https://login.microsoftonline.com/{0}/oauth2/authorize",
ConfigurationManager.AppSettings["ida:Tenant"]))
.Scopes(scopes =>
{
scopes.Add("AdminAccess", "Admin access to protected resources");
scopes.Add("FullAccess", "Full access to protected resources");
scopes.Add("UpdateAccess", "Update access to protected resources");
scopes.Add("ReadAcces", "Read access to protected resources");
});
Sorry, that is a bug in the Swagger-UI.
The latest version looks different; take a look here: src/main/html/o2c.html
We just have to publish a new NuGet package of Swashbuckle (hopefully soon).

Electron app: How do you use ipcRenderer.sendToHost()?

In the Electron documentation for the webview tag, the following example is given to show how to communicate between the renderer process and the web page hosted in the webview:
With sendToHost method and ipc-message event you can easily communicate between guest page and embedder page:
// In embedder page.
const webview = document.getElementById('foo')
webview.addEventListener('ipc-message', (event) => {
  console.log(event.channel)
  // Prints "pong"
})
webview.send('ping')

// In guest page.
const { ipcRenderer } = require('electron')
ipcRenderer.on('ping', () => {
  ipcRenderer.sendToHost('pong')
})
However, in my guest web page (inside the webview), I get Uncaught ReferenceError: require is not defined when I try to require('electron'), as indicated in the docs.
Is there something else I need to do to be able to require the ipcRenderer module from the guest web page?
Electron version: 1.4.6
Note: I'm not sure if this is important or not, but the page running inside my webview is served from a local server. In my top-level page in the renderer process, I do something like: document.getElementById("webview").src = "http://localhost:1234/...";.
Edit: It looks like serving my web page from a local server does not change anything. I have the same error after trying with a static HTML file. It looks like the example in the docs simply doesn't work, or I'm understanding it wrong.
<!-- Simple foo.html somewhere on my computer -->
<script>
  const { ipcRenderer } = require('electron')
  ipcRenderer.on('ping', () => {
    ipcRenderer.sendToHost('pong')
  })
</script>
// In embedder page, in renderer process
document.getElementById("webview").src = "file://path/to/foo.html";
Output from the embedded page (inside the webview):
Uncaught ReferenceError: require is not defined
EDIT
For security reasons, the preferred way to use require in renderer processes is to use preload to inject only the minimum node integration your page requires. See point 2) of Electron's security recommendations. A minimal example for ipcRenderer:
// main.ts
const { BrowserWindow } = require('electron')
const path = require('path')

const mainWindow = new BrowserWindow({
  webPreferences: {
    nodeIntegration: false,
    // preload expects an absolute path to the script
    preload: path.join(__dirname, 'preload.js')
  }
})
mainWindow.loadURL('https://my-website.com')

// preload.js
const { ipcRenderer } = require('electron')

window.sendToElectron = function (channel) {
  ipcRenderer.send(channel)
}
In your webpage you can now use window.sendToElectron("ping").
If you're using <webview> inside the renderer process, you can use <webview src="page.html" preload="./preload.js" /> to achieve the same result. So, that's what I would use to answer my original question, and inside preload.js I would inject a function that calls ipcRenderer.sendToHost("pong") in the global window.
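A minimal sketch of that webview variant (the preload file name and the window.pong helper are illustrative):
// preload-webview.js, loaded via the webview preload attribute
const { ipcRenderer } = require('electron')

// expose a single helper to the guest page instead of all of ipcRenderer
window.pong = function () {
  ipcRenderer.sendToHost('pong')
}

// embedder page:
// <webview id="webview" src="http://localhost:1234/..." preload="./preload-webview.js"></webview>
const webview = document.getElementById('webview')
webview.addEventListener('ipc-message', (event) => {
  console.log(event.channel) // prints "pong" after the guest page calls window.pong()
})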
Old answer (bad for security)
I had missed a vital point in the webview docs. To be able to call require from the page embedded inside the webview, you need to set the nodeintegration attribute on the webview tag:
<webview id="webview" nodeintegration />

How to temporarily handle routing when migrating a rails app to react?

I'm getting ready to migrate a monolithic rails app to a react app with a rails backend and json api. For now we've integrated the react application as a static asset in our rails app, and are slowly transitioning every page to be rendered by react.
The problem is that the react app doesn't do the routing (because the rails app currently handles it). We only want to transition the frontend completely to react once all pages have been transferred.
However, the react app shouldn't render the same content on every page, of course. It should render the appropriate content based on the page on which it is initiated (it would re-initialize on every page load, but the script for the app itself would remain the same).
So the question is, what would be the recommended way to enable react to render the appropriate content. Does it make sense to use react-router, and use it only to render the content based on the url, but not have it handle links?
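For illustration, the "render from the URL, but don't handle links" idea could look roughly like this (a sketch in react-router v3 style; PlansView and the react-root mount point are placeholders):
import React from 'react'
import ReactDOM from 'react-dom'
import { Router, Route, browserHistory } from 'react-router'

// Rails serves the page and this bundle; react-router only picks the component
// that matches the URL Rails responded to. As long as navigation keeps using
// plain <a> tags (not <Link>), every page change is still a full Rails request.
ReactDOM.render(
  <Router history={browserHistory}>
    <Route path="(:lang/)reading-plans" component={PlansView} />
  </Router>,
  document.getElementById('react-root')
)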
So in your route file:
export default function (requirePlanData) {
  return (
    <Route path="/">
      <Route path="(:lang/)reading-plans" component={PlansView}>
        <Route path=":id(-:slug)" component={AboutPlanView} onEnter={requirePlanData} />
      </Route>
    </Route>
  )
}
Here you can see that on render of '/en/reading-plans/903' the 'AboutPlanView' component will be rendered, and the 'requirePlanData' function will be called to populate the app state required for the component view.
In your main js file with your requirePlanData you can call your action creators for your API calls and populate state with your reducers before the view is loaded:
function requirePlanData(nextState, replace, callback) {
  const { params } = nextState
  var idNum = parseInt(params.id.toString().split("-")[0])
  const currentState = store.getState()
  if (currentState && currentState.plansDiscovery && currentState.plansDiscovery.plans && currentState.plansDiscovery.plans.id === idNum) {
    callback()
  } else if (idNum > 0) {
    store.dispatch(PlanDiscoveryActionCreators.readingplanInfo(
      { id: idNum, language_tag: window.__LOCALE__.planLocale },
      store.getState().auth.isLoggedIn
    )).then((event) => {
      callback()
    }, (error) => {
      callback()
    })
  } else {
    callback()
  }
}
Now your action creators will return, fire off your reducers to populate state, and your component will be able to render with all the required data, based on the url/route you defined.
Does this help you?
