Read-only & non-realtime mode activated to improve browser performance
This message pops up in my project, and I'm unable to delete the nodes as well.
Also, I read this: https://groups.google.com/forum/#!topic/firebase-talk/qLxZCI8i47s
which states:
If you have a lot of nodes in your Firebase (say thousands), we need to create a new element for each node and modern browsers simply have limitations of how many DOM elements you can add to a page
and suggests:
To resolve this problem, don't load your Firebase Dashboard at the root of your Firebase, but instead load it lower down in the hierarchy
I do not understand what that means.
How do I get back to my Realtime Dashboard?
If you want to delete a high-level node while this mode is activated, I recommend doing the following.
Open up a text editor and type in { }. Save this file as "blankJSON.json".
Go to the high-level node you want deleted and select it. Once it opens up and shows you all the nodes that need to be removed, select the three bars at the top right and choose "Import JSON". (It would be safe to first "Export JSON" if you don't have backups, in case you make a mistake here.) Import the JSON file we created earlier, titled "blankJSON.json".
This will delete all of the data inside.
Once again, I highly suggest you make a backup before doing this. It's extremely easy to make a backup, and it is much easier than you would think to import this blank JSON into the wrong node and erase a bunch of important data.
When it detects that it's downloading too many nodes from your database, the Firebase Console stops using real-time mode and switches to read-only mode. In this mode it requires less work from the browser, so it is more likely that the browser will stay performant.
To get back to realtime mode, you have to go to a location that has fewer nodes. So say you start loading the database at the root; that means the "pseudo address bar" at the top of the data tree will say:
https://<your-project>.firebaseio.com/
It will then show the list of items. Now click on the URL in that pseudo address bar and change it to:
https://<your-project>.firebaseio.com/<one-of-your-keys>
And hit enter. The data tree will reload with just the node from one-of-your-keys and down and will likely switch to realtime mode again.
Every node key in Firebase is a link; you can open a sub-node in a new tab and then edit that sub-node and its children.
Right click on a sub-node you want to edit or delete
Select open link in a new tab
Edit the sub-node in the new tab
1) Click on the Node you want to mass delete
2) Import an empty .json file (just containing curly braces, {} )
3) The node's value will be set to null; in other words, it is deleted, or rather overwritten with an empty node!
What you can do is have an OnClickListener and call the removeValue() method on your DatabaseReference, like this:
mCart.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        // Removes the value at this location, i.e. deletes the node
        mDatabaseReference.removeValue();
    }
});
I have the same problem... I'm a bit surprised because I thought Firebase could easily scale to support huge amounts of data (for example, millions of users, etc.).
I have a node with 80,000 sub-nodes (each object has its own push ID) and I cannot delete or perform any action on it because real-time mode doesn't work in the Firebase console.
I think the only way to update or delete the data is to do it via Java code :(
Repeatedly trying to load specific keys can be tiresome. There is a Python library that can do this for you easily.
http://ozgur.github.io/python-firebase/
I needed to delete a lot of keys and this helped me do that in one go.
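For what it's worth, here is roughly what that looks like with the library above - a minimal sketch, assuming the standard python-firebase API, with placeholders for the project URL and the node path:

from firebase import firebase

# Placeholders: point this at your own project and the node you want gone.
fb = firebase.FirebaseApplication('https://<your-project>.firebaseio.com', None)

# Deleting the node sets it to null, which has the same effect as importing {}
# over it in the console.
fb.delete('/big-node', None)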
What I do is export the entire tree, edit/add the node I want using an editor, then import the JSON and overwrite the previous node/tree. Problem solved! Risky though 😁
I have a requirement that calls for a menu screen containing 10 options; the user can select an option and jump to the appropriate screen. I have created a Trans-ID for the menu screen. Do I need to create a Trans-ID for each of the 10 options? I have searched for this type of requirement, and everything I found involves creating a Trans-ID for each sub-screen so that the screen can be refreshed and returned to until the user chooses to go back to the main screen.
I am new to CICS COBOL programming and not sure why we need to create a Trans-ID for each screen. Is this the usual convention, or is there another approach available?
No, you don't need to use a tranid per screen/function in this scenario. You could actually use one transid and even one program in a pseudoconversational style.
You would use a commarea to hold the state of the interaction with the user at the terminal, so when the user picks an option and the next 'leg' of the pseudoconversation invokes the transaction and program again, you can determine in that program what has just been received from the terminal, what to do with it and what response to send back to the terminal. This process simply repeats until the business function completes and you can end with the default menu again.
I'm trying to set the Visitor ID in Adobe Analytics through DTM.
Above the s_code I have:
var visitor = new Visitor("xxxx")
visitor.trackingServer = "xxx.xx.xx.omtrdc.net"
I've created a data element where the legacy code used to call the
Visitor.getInstance("xxxx");
and set the Visitor ID to %Visitor ID%
That's not working however, and my visitor ID is always just set to %Visitor ID% and obviously not reading any values. I'd really appreciate any input that someone can give me.
Thanks,
Mike
The Visitor ID field pops s.visitorID and is in general related to visitor ID, but it is not the same as s.visitor, which is what gets popped for the VisitorAPI integration. DTM does not currently have a built-in field for the s.visitor variable, so you will have to set it yourself within the config, either in the Library Management code editor (assuming you are opting to copy/paste the core lib and not the "Managed by Adobe" option) or else in the Custom Page Code section.
Since you are popping it in a data layer first, you can reference the data layer like this:
s.visitor = _satellite.getVar('Visitor ID');
NOTE: A separate potential issue you may have is with whether or not the Visitor object is available for your data element. Since data elements are the first thing to be evaluated by DTM, you will need to ensure that the VisitorAPI.js library is output before your top page DTM script include.
If this is a problem for you, or if you are wanting to host VisitorAPI.js within DTM, then you may need to adjust where you are popping that stuff. For example, place the VisitorAPI core code above the custom code as the first stuff within the data element, before:
var visitor = new Visitor("xxxx");
visitor.trackingServer = "xxx.xx.xx.omtrdc.net";
Or, don't use the data element at all. Instead, put the VisitorAPI code within the Adobe Analytics custom code or core lib section and pop all that stuff (above the s.visitor assignment). Or a number of other methods; the point is, the VisitorAPI stuff must be loaded before the data element can make use of it, the same as it must be loaded before Adobe Analytics can make use of it.
So DTM is changing pretty fast and furious right now. They have a "Marketing Cloud Service ID" that works well. Before I used that, however, I did find a way to fix the code. Crayon Violent was right, as usual, that the problem was that the script wasn't available yet. I fixed this by putting the following code in between the VisitorAPI.js and the AppMeasurement stuff in the DTM managed library.
var aA = new AppMeasurement();
aA.visitorNamespace="companyname";
aA.visitor = Visitor.getInstance("companyname");
In addition, there were also some issues using my localhost for testing while trying to see if I had this correct or not. If you are having issues and think you have it correct, it may be worthwhile to elevate it to a different environment.
I have a view data source that uses a view key to access documents and show them in a repeat with var "posts". Within the repeat I have a document data source with var "post" that gets the UNID of the documents using posts.getUniversalID().
Further down the repeat I have another document data source, "newcomment", that is a response and takes the parent ID as: post.getDocument().getUniversalID()
Below the newcomment data source I have an edit box and a submit button which saves the comment as a response to the "post" using newcomment.save().
Here is my problem:
Two people access the same XPage. PersonA enters the page and starts writing a comment on a post. At the same time, personB creates a new post and submits it before personA submits the comment. What happens now is that the comment gets bound to the latest post and not to the post personA responded to.
I tried another thing as well. Let's say there are 10 posts in the database. PersonA and personB access the XPage. PersonA starts writing a comment on post number 8. At the same time, personB creates two new posts in the database. When personA now submits the comment, it seems to get bound to the same index, which is now two posts up but still index 8 - which is of course the wrong post.
If I change the repeat to "createControlsAtPageCreation", i.e. repeatControls=true, the comment is attached to the correct post, but then I run into another problem: the view is not updated to show the latest posts.
My repeat is within a custom control that is loaded dynamically using the dynamic content control in ExtLib.
For information, here is what I have found about the repeatControls setting:
Setting the repeatControls property to true instructs the repeat control to create a new copy of its children for each iteration over the dataset.
When the Repeat control is configured with the property repeatControls="true", it repeats its contents only once, at page load time.
So my question here is that I do not understand what is going on. Why is my comment attached to the wrong parent document? And is there a way I can prevent this and still have new posts displayed correctly?
Thanks for your help.
Without the code it's a bit hard to imagine what exactly is going on here, but this looks very similar to a problem that I had with the repeat control and value binding.
Long story short, the problem was connected to the repeatControls property being set to false. When it was like that, data bindings were only working for the first element in the collection - all data was somehow magically saved to this first object! I managed to get this working by using a combination of a dynamic content control rebuild and repeatControls set to true. Only then were the data bindings working properly.
It seems like if you are repeating rendering only (and this is what repeatControls set to false does), the decoding phase of the JSF lifecycle goes foobar.
Without your XSP markup, it's difficult to be absolutely definitive, but it appears that your app code is creating and persisting the datasources and components per row during page load - therefore also increasing the overall size and complexity of the component tree. You should instead try an approach that will lazy-load the datasource only when requested by the end user (e.g. edit / reply).
Refer to the XPages Extension Library demo application (XPagesExt.nsf) for examples that use such a dynamic approach. In particular, look at Core_InPlaceForm.xsp which demonstrates using the xe:inPlaceForm control within a xp:repeat. And also see Domino_ForumView.xsp which demonstrates using the xe:forumView and xe:forumPost controls to manage and visualize a hierarchical thread. Also consider the concurrency mode that best suits your requirements when it actually comes to saving any given post or comment (fail, createConflict, force, exception) and document locking for high contention situations. The above-mentioned controls all provide the highest level of dynamic control and datasource creation and destruction.
Please feel free to send me on a worked example database, where I can understand your exact use case - DM me or email me.
This is one of those just-making-sure-I-didn't-miss-anything posts.
I have a Tkinter GUI in Python 2.7.3 that includes a listbox, and there are circumstances where I'd like to directly modify the text of a specific item at a known index. I've scoured the documentation and there's no lb.itemset() method or anything like it. As best I can tell I have two options, either of which would work but both just seem kind of klunky to me:
lb.delete() the old item and lb.insert() the new value for it at the same index (including a step to re-select the new value if the old deleted one happened to be selected).
Create a listvariable for the listbox, then use get() and set() on it -- with a pile of replace/split/join acrobatics in between to handle the differing string formats involved.
Is there some simpler, more direct way to do it that I'm missing? Or have I turned up all the available options?
Assuming from silence that there's nothing I missed. I went with option 2 -- the acrobatics weren't quite as complex as I'd thought. I just created a behind-the-scenes list wrapped up in a class; every time I update the list, the class syncs up the content of the listbox by doing a ' '.join on the list then setting the listbox's listvariable to the resulting string.
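For anyone who lands here later, here is a minimal sketch of that wrapper class, assuming Python 2.7's Tkinter and made-up names (items containing spaces would need extra quoting for the underlying Tcl list):

import Tkinter as tk

class SyncedList(object):
    """Behind-the-scenes list that keeps a Listbox's listvariable in sync."""
    def __init__(self, listbox, items=None):
        self.items = list(items or [])
        self.var = tk.StringVar()
        listbox.config(listvariable=self.var)
        self._push()

    def _push(self):
        # The listvariable holds one space-separated string of all items.
        self.var.set(' '.join(self.items))

    def set_item(self, index, text):
        # Directly replace the text of the item at a known index.
        self.items[index] = text
        self._push()

So something like sync = SyncedList(lb, ['alpha', 'beta']); sync.set_item(1, 'gamma') would rewrite the second item in place without the delete/insert dance.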
I'm managing a FileMaker Pro 12 solution to create purchase orders and send them via email on the iPad, but the iOS platform doesn't allow exporting Excel files.
Currently we are sending the orders as a .pdf file but the warehouse has to manually approve each order. The solution we were thinking of is to send a .csv file attached to an email so that the warehouse has less work.
I'm using 3 different tables to create an order: a table that stores the products, a container table (to set the quantity, discount, etc.) and a table for the order itself. So you can, for example, put three of Product-A and two of Product-B in an order, which works fine. My PDF solution uses an extra layout and works fine. On a PC or Mac you can export Excel files with a script and the "Save Records as Excel" function, but this is not available on iOS, so I'm looking for alternatives.
The script I came up with to send the .csv file via email looks like this:
Set Error Capture [On]
Freeze Window
Go to Related Record [Show only related records; From table: "Orders_Container"; Using layout: "MailLayout" (Orders_Container); New Window]
If [Get ( LastError ) = 0]
    Sort Records [Restore; No dialog]
    Set Variable [$FILE; Value: Get ( TemporaryPath ) & "file.csv"]
    Export Records [No dialog; "$FILE"; Windows (ANSI)]
    Send Mail [Send via E-mail Client; To: "customer@web.com"; Subject: "Order"; "$FILE"]
End If
Close Window [Current Window]
I tried to use the "Export Records" function, but it exports all records of the current customer instead of the 'Records being browsed', which is only available for the PDF export. I've tried to work around this, but on iOS you can't create a new file to export, and I can't come up with anything else. I don't want the users to use Menu -> Export -> E-Mail, because they would have to enter 3-4 email addresses each time, which is way too time-consuming (this also exports all records instead of the ones currently being browsed). Is there any way to export the records that are currently open, save them in a .csv file and send a mail with the file attached? Thanks in advance.
EDIT: Ironically, after some testing with different scripts, the script I posted here now works correctly.
The Export Records function exports the current found set - the same as Records Being Browsed. This indicates that other issues are keeping your script from working.
You might want to look at the Go to Related Record step and which set it finds. Use a temporary pause just before the Export Records step is called to see what set is present.
Also, keep in mind that if there is an error with your Go to Related Record call, nothing happens. The set left over from a previous export would then remain in place, and this might show a different set of records than you would expect.