graphenedb_url connection error - neo4j

My GRAPHENEDB_URL comes from Heroku and is used to access my Neo4j database online. It is correct, but when I initiate the db connection it returns error 403, which is a forbidden request.

I'm founder & CEO of GrapheneDB. philippkueng/node-neo4j supports authentication via URL.
According to the project's readme, the snippet should look like this. I've adjusted it to load the connection URI from the env variable:
// Load the driver and connect with the GrapheneDB connection URI that Heroku exposes as an env variable
var neo4j = require('node-neo4j');
var db = new neo4j(process.env.GRAPHENEDB_URL);
Attention: the latest release of that driver is 9 months old, so it might not be compatible with the latest versions of Neo4j. This is not related to your authentication issue, though.
For an up-to-date Node.js driver I'd recommend thingdom/node-neo4j.

Can you describe what you've tried?
Perhaps you need the username and password? Your driver might not support credentials as part of the URL, so you might need to specify them separately, as in the sketch below (keep in mind there are two node-neo4j drivers when looking at documentation).
Also, ideally you should be using the Heroku environment variable rather than hardcoding the URL.
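A minimal sketch of specifying the credentials separately, assuming the thingdom/node-neo4j driver (npm package "neo4j") in a version that accepts an auth option, and assuming GRAPHENEDB_URL embeds the credentials in the usual user:password@host form:
var url = require('url');
var neo4j = require('neo4j'); // thingdom/node-neo4j

// Split the credentials out of the GrapheneDB connection URI
var parsed = url.parse(process.env.GRAPHENEDB_URL);
var credentials = (parsed.auth || ':').split(':');

// Pass the bare URL and the credentials separately
var db = new neo4j.GraphDatabase({
  url: parsed.protocol + '//' + parsed.host,
  auth: { username: credentials[0], password: credentials[1] }
});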

Related

QuickBooks RDS client (or maybe server) is throwing Public key exchange error: Object already exists

I am working with Intuit SDK tech support on this without much luck. I was hoping someone here might know what object this error might be referring to.
I am basically sending a request from one computer to QuickBooks on another machine via PowerShell and the Remote Data Sharing (RDS) Client/Server provided in the QuickBooks SDK. The relevant portion of the PowerShell script looks something like this:
# Instantiate the QuickBooks SDK request processor COM object
$myQBXMLRP = New-Object -com QBXMLRP2.RequestProcessor
# Open a connection to QuickBooks (application ID, application name, connection type)
$myQBXMLRP.OpenConnection2("qb4D","CCFolioPro",2)
The first line instantiates the COM object QBXMLRP2.RequestProcessor. The second line opens a connection with QuickBooks. The RDS Client on the local machine receives the OpenConnection request and passes it on to the RDS Server on the machine where QB resides, which in turn opens the connection with QB. The second line is throwing the following error:
Exception calling "OpenConnection2" with "3" argument(s): "Public key exchange error: Object already exists
Reading here on Stack Overflow and elsewhere via Google, I see that this error occurs for other programmers not dealing in any way with QuickBooks, so I am hoping someone here might be able to help me figure out how to fix the problem.
I had QB/RDS working fine prior to this installation, so I know it should work as is. Something is hung up on this computer, the server computer, or somewhere else entirely.
Thanks,
John
Your mileage may vary. I cleared the following file, rebooted, and reinstalled RDS, and things were back to normal. It was on Windows Server 2012, but you get the idea. You need to change your folder view options so these folders are not hidden. It's worth a shot, and I hope this helps.
Clear one of the key files created by RDS during prior runs or installations.

DSE OpsCenter best practice fails when Cassandra PasswordAuthenticator is used

The following best practice checks fail when Cassandra's PasswordAuthenticator is enabled:
Search nodes enabled with bad autocommit
Search nodes enabled with query result cache
Search nodes with bad filter cache
My values comply with the recommended values, and I have confirmed that the checks indeed pass when I disable authentication in Cassandra. What's odd is that there are 6 checks under the "Solr Advisor" category of the Best Practice Service, and only these 3 fail when authentication is enabled.
Is this a known bug in OpsCenter? I'm using v5.0.1, but I've seen this since v5.0.0.
Where can I file bug reports like this? Does Datastax have a public bug tracker?
PS:
I actually feel that this question is more appropriate on Server Fault, but I don't have enough reputation on that site to create the tags "datastax" and "datastax-enterprise". Can somebody please do so and move this question?
When Cassandra is using PasswordAuthenticator, the HTTP routes that the OpsCenter agent uses to determine the Solr schema settings also become password protected (however, the agent does not supply the password correctly). This is a bug in the OpsCenter agent and can be referenced as OPSC-3605.
Unfortunately, DataStax Enterprise does not have a public bug tracker. If you're a DSE customer, the best route is probably to go through DSE support.
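For anyone trying to reproduce this, the setting that triggers the behaviour is the authenticator option in cassandra.yaml (a sketch; the default is AllowAllAuthenticator, and the file location depends on your DSE install):
authenticator: PasswordAuthenticator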

Running a PHP script on email arrival in an IMAP Server

I'm trying to implement a webmail client in PHP. I would like to write a PHP CLI script which is run on every email arrival to store some parts (not all) of the incoming email in a database for search purposes. Then, when the user has finished searching and chosen an email to show, a connection is made to the mail server to retrieve the complete email. To implement this scenario I need some sort of link between the emails in the database and those on the mail server.
Since my knowledge of working with mail servers is limited to Zend Framework's API, what I believe I need in order to retrieve an email from an IMAP server is a message number or a message unique ID (the latter does not seem to be supported by all mail servers).
So far, I've managed to find .forward (and some other ways) to introduce my PHP CLI script to MTAs so it is run on every email arrival. This way I can store emails in the database. But this won't do, since the message unique ID is created by the MDA, so the MTA does not know of it and cannot provide it to me. This means I cannot find the emails later when I want to retrieve them from the mail server.
At last, here's my question: is there a way to introduce a PHP CLI script to an MDA to be run on email arrival? If this is dependent on the mail server, which servers support this and how? My personal choice would be Dovecot or Courier, but any other mail server would do as well.
This is tricky -- there are many ways to set up delivery. Some of them work with the underlying mail store directly, bypassing your IMAP server altogether, while others use e.g. Dovecot's facilities.
Have you considered building on top of the notify plugin which ships with Dovecot?
It seems it's impossible to introduce such a PHP CLI script to the IMAP server (at least I'm sure of Dovecot). Anyway, the workaround I found for this problem is to use my own PHP script to insert the new mails into the IMAP server, retrieve their IDs, and then store the IDs in the database for future reference. To be clear, emails are given to my PHP CLI script by the MTA, not the MDA. As I said before, this is easily done using a .forward file.
[UPDATE]
Unfortunately, it seems this solution cannot be implemented either. The way to insert a new email into an IMAP server is the APPEND command, and to get the UID of the recently added mail the server must support the UIDPLUS extension. Neither Dovecot nor Courier supports this extension at the moment! If they did, the server would return the UID in an APPENDUID response.
[UPDATE]
My bad: Courier does support UIDPLUS after all, so this solution is valid and is the one I'm going to implement.
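For reference, a minimal sketch of the delivery-time hook described above; the script path, database DSN, credentials and table name are all placeholders, and only the headers are indexed here. The .forward file contains a single pipe entry:
"|/usr/bin/php /home/user/bin/index-mail.php"
and the script it points at reads the raw message from STDIN:
#!/usr/bin/env php
<?php
// The MTA pipes the full raw message into this script via the .forward entry above
$raw = stream_get_contents(STDIN);

// Keep only the parts needed for searching (just the headers in this sketch)
$normalized = str_replace("\r\n", "\n", $raw);
list($headers) = explode("\n\n", $normalized, 2);

// Store the searchable parts; DSN, credentials and table are placeholders
$pdo = new PDO('mysql:host=localhost;dbname=webmail', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('INSERT INTO mail_index (headers, received_at) VALUES (?, NOW())');
$stmt->execute(array($headers));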

changing gerrit's canonical web url

I have had an issue with setting up my Gerrit server. The machine has Ubuntu 12.04 LTS Server 64-bit installed on it. I am setting up Git and Gerrit as a way to manage source code and code review.
I require internal and external access to it. I set up a DNS entry that would work externally. However, during the initial setup, I left canonicalWebUrl at its default value. It usually takes the machine's hostname (in this case it was vmserver).
The issue I was running into is exactly as explained here: https://stackoverflow.com/questions/14702198/the-requested-url-openid-was-not-found-on-this-server, where after trying to sign in/register an account with OpenID, it was saying the URL was not found.
For some reason, it was changing the URL in the address bar from the DNS name I set up to the canonicalWebUrl.
I tried to change the canonical web URL in the gerrit.conf file found in the etc directory of the Gerrit site. After restarting the server, the git project files were present as they should be, but the administrator account seemed to no longer be registered and none of the projects were visible through Gerrit.
Is there a special procedure for changing the canonical web URL in Gerrit without disrupting access to the server?
Any help or information on canonical URLs would be much appreciated, as I cannot find much information on them.
Edit:
Looking deeper, I found some information that is way over my head regarding "submodules".
I do not understand if this is what I am looking for or not.
https://gerrit-review.googlesource.com/#/c/36190/
The canonical web URL must be set, and it sounds like you have done that correctly.
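For reference, the value lives in the [gerrit] section of the site's etc/gerrit.conf (the hostname below is just an example), and Gerrit needs a restart to pick up the change:
[gerrit]
	canonicalWebUrl = http://gerrit.example.com/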
I suspect the issue you are seeing is caused by changing the canonical web URL: some OpenID providers (Google being the big one) will return a different user ID based on the URL of the request. This is a privacy thing and cannot be changed. So previous users will now show up as new users and won't be in their old groups (the Administrators group in this case).
If you don't have many users, it might be easiest to migrate them by hand. You can modify the database to map the new user ID to the old user account.

is there any way to find out the server from an ActiveRecord connection?

I'm using AR with the SQLServer adapter on Rails 2/Linux. On my local env, I can easily check the FreeTDS and odbc.ini files to trace back and find out the connection information. But in test envs, this information constantly changes and gets out of sync, so I'm trying to put it in our logging as well so we can troubleshoot more easily.
Yes, I know TinyTDS does this better; we are moving to that, but we're not quite there yet.
I can do:
ActiveRecord::Base.connection.current_database
But I can't find anything similar for getting the server address or IP.
I can't remember if there is a public API for this, but you can get the AR connection config like this:
ActiveRecord::Base.connection.instance_variable_get(:@config)
This returns the config hash, which includes the host.
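For example, to pull the server out of that hash for logging (a sketch; the exact keys depend on the adapter, an ODBC-based setup may expose :dsn rather than :host, and Rails.logger assumes Rails 2.1+):
config = ActiveRecord::Base.connection.instance_variable_get(:@config)
Rails.logger.info "DB server: #{config[:host] || config[:dsn]}"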
