My case is as follows:
I have a MySQL database which I exported and imported into a Fusion Table. Now I need to modify the PHP page so that it queries the data from the Fusion Table instead of the localhost database.
I've read about querying Fusion Tables in the developers guide, but it refers to a GData Java library and the SQL API. On another site I saw that other libraries can be used for querying, including the Zend Framework, which is PHP-based and therefore relevant in my case. However, I haven't come across any sample or tutorial that simply shows a PHP page querying a single cell, row, or column, which I could then modify and adapt to my design.
My questions are:
Do I need to install/upload some library to my hosting space?
Is there a PHP sample or tutorial on querying Fusion Table data that I might have missed?
Can anyone give me a clue how I would query, for example, the latitude value of the row with ID 5? (Pages are retrieved using the path articles.php?lang=en&pg=comm_en&m=view&commID=88.)
Below is the code I have in the comm_en.php page, which queries the data from MySQL in PHP.
Thank you in advance!
Drini
<?php
session_start();
foreach ($_REQUEST as $key => $value) {
    $$key = addslashes(htmlspecialchars(strip_tags($value)));
}
if(isset($_GET["m"])) {$m=$_GET["m"];} else {$m="";}
if(isset($_GET["commID"])) {$commID=$_GET["commID"];} else {$commID=1;}
if(isset($_GET["lang"])) {$lang=$_GET["lang"];}
else {$lang="en";}
Shfaq($m,$commID);
function Shfaq($metod, $commID)
{
    switch ($metod) {
        case "view":
            View($commID);
            break;
        default:
            View($commID);
            break;
    }
} // end of Shfaq();
function View($commID)
{
    $link = mysql_connect("localhost", "db-user", "db-pass") or die("Connection failed!");
    mysql_select_db("DataBase-name") or die(mysql_error());
    $queryComm = "SELECT * FROM communities WHERE id ='$commID' LIMIT 0,1";
    $commRes = mysql_query($queryComm) or die(mysql_error());
    $comm = mysql_fetch_array($commRes);
    $healthPerc = round(($comm["healthBooklet"]/$comm["totalChildNo"]), 3)*100;
    echo '<table class="gpsbox" align="right">
        <caption>GPS Location</caption>
        <tr><td>Latitude </td><td>'.$comm["latitude"].'</td></tr>
        <tr><td>Longitude</td><td>'.$comm["longitude"].'</td></tr>
        </table>....<html coding continues>
Here is how to use the newest PHP API client library with Fusion Tables:
// Assumes the client library's autoloader is available; adjust the path to your install.
require_once 'google-api-php-client/autoload.php';

$client_email = '<id>@developer.gserviceaccount.com';
$private_key = file_get_contents('<private key>.p12');
$scopes = array(
    'https://www.googleapis.com/auth/fusiontables',
    'https://www.googleapis.com/auth/fusiontables.readonly',
);

$credentials = new Google_Auth_AssertionCredentials(
    $client_email,
    $scopes,
    $private_key
);

$client = new Google_Client();
$client->setAssertionCredentials($credentials);
if ($client->getAuth()->isAccessTokenExpired()) {
    $client->getAuth()->refreshTokenWithAssertion();
}

$service = new Google_Service_Fusiontables($client);
$result = $service->query->sql('SELECT <column> FROM <table id>');
var_dump($result);
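The call returns the library's Sqlresponse model; as a rough sketch (assuming the usual generated getters on that model), the first cell of the result can be read back like this:
// $result is the Google_Service_Fusiontables_Sqlresponse returned above.
$rows = $result->getRows();   // array of rows, each an array of column values
if (!empty($rows)) {
    echo 'Value: ' . $rows[0][0];
}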
There is a PHP client library; you can use it if you want (it's probably the easiest way to access Fusion Tables from PHP).
Alongside the client library there is also PHP sample code; take a look, it will probably help you understand the concept.
When you use the library, you can simply write SQL statements, e.g. something like
select latitude from 123456 where id = 5;
Here, 123456 refers to the table ID of your Fusion Table. If your table is public, you don't even have to care about authentication, but if you want to access private tables or update/insert data, I'd recommend using OAuth for authentication.
Sample code in general can be found on the Google Fusion Tables API Sample Code page.
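If your table is public, a plain HTTP request to the Fusion Tables REST API is enough; below is a minimal, unauthenticated sketch (it assumes the v1 query endpoint, and the table ID and API key are placeholders you would replace with your own):
<?php
// Query a *public* Fusion Table over the REST API, no OAuth involved.
$tableId = '<table id>';      // your Fusion Table id
$apiKey  = '<api key>';       // a simple API key from the Google API Console
$sql     = "SELECT latitude FROM {$tableId} WHERE id = 5";

$url = 'https://www.googleapis.com/fusiontables/v1/query'
     . '?sql=' . urlencode($sql)
     . '&key=' . $apiKey;

$json = file_get_contents($url);   // or use cURL, as elsewhere on this page
$data = json_decode($json, true);

// The response holds 'columns' and 'rows'; the requested value is rows[0][0].
if (isset($data['rows'][0][0])) {
    echo 'Latitude: ' . $data['rows'][0][0];
}
This maps directly onto the commID example in the question: replace the hard-coded 5 with the sanitized $commID.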
Related
I am trying to replicate in Angular Material an app that I have had running for a couple of years with HTTP/JavaScript. It is essentially a database-type app with CRUD operations on a MySQL database on a remote server, with PHP sitting between the app and the database.
Several of the tables returned from the server as JSON are range-limited by a parameter that is sent from the JavaScript, received by the PHP, and used in the WHERE clause of the SELECT statement, as in this code snippet:
$registration = $_POST['REGISTRATION'] ;
$query = "SELECT sysid, REGISTRATION, TYPE, COMPONENT, SERIAL, PART_NUMBER FROM status_hours WHERE REGISTRATION='$registration' ";
This HTTP/JavaScript/PHP code is working perfectly; however, I would like to replicate the app in Angular Material.
With Angular Material I can retrieve records as JSON perfectly, but I cannot get a parameter sent across so that it can be used in the WHERE clause in the PHP.
Here is the TypeScript code in the app which gets a table of rows from the server.
const url = 'http://mydatabase.com/angular/php/status_hours.php' ;
const options = { params: new HttpParams().set('REGISTRATION', this.REGISTRATION) } ;
this.http.get<Hours[]>(url, options)
  .subscribe(hours => {
    console.log(hours);
    this.dataSource.data = hours;
  });
Here is the PHP code I have tried in order to receive the parameter, which is called REGISTRATION, so that it can be used in the WHERE clause.
$postdata = file_get_contents("php://input");
$request = json_decode($postdata);
$registration = $request->REGISTRATION;
The problem is that either the parameter is not being sent by the Angular app, or it is not being received by the PHP, so it cannot be used in the WHERE clause.
If I hardcode it for simulation purposes, $registration = 'hardCodedData';, the script returns valid JSON data.
Thank-you for your assistance.
Here is the answer to my question about the proper procedure for passing parameters to PHP from Angular/TypeScript.
The PHP code must contain $registration = $_GET['REGISTRATION']; rather than $registration = $_POST['REGISTRATION'];, which was working for me with HTML/JavaScript and which I had taken from published example code. Angular's HttpParams on an http.get() request arrive as URL query parameters, so PHP sees them in $_GET. Note that REGISTRATION is just the name of the parameter being sent by Angular.
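For reference, here is a minimal sketch of the PHP side under that assumption (the connection details are placeholders; a prepared statement is used here instead of interpolating the value into the SQL string):
<?php
// Hypothetical connection details - replace with your own.
$mysqli = new mysqli('localhost', 'db-user', 'db-pass', 'db-name');

// HttpParams on an http.get() call arrive as query-string parameters, so read $_GET.
$registration = isset($_GET['REGISTRATION']) ? $_GET['REGISTRATION'] : '';

// Bind the parameter instead of concatenating it into the WHERE clause.
$stmt = $mysqli->prepare(
    'SELECT sysid, REGISTRATION, TYPE, COMPONENT, SERIAL, PART_NUMBER
     FROM status_hours WHERE REGISTRATION = ?');
$stmt->bind_param('s', $registration);
$stmt->execute();

// get_result() requires the mysqlnd driver; bind_result() is the alternative.
$rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);

header('Content-Type: application/json');
echo json_encode($rows);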
Thanks for the assistance everyone!
Here is a link I have used to pass multiple parameters; the same PHP code applies, of course.
Angular 4 HttpClient Query Parameters
My inexperience has left me short of understanding how to hide an API key. Sorry, but I've been away from web development for 15 years while I specialized in relational databases, and a lot has changed.
I've read a ton of articles but don't understand how to take advantage of them. I want to put my YouTube API key(s) on the server, but have the client able to use them without exposing them. I don't understand how setting an API key on my server (ISP provided) enables the client to access the YouTube channel associated with the project. Can someone explain this to me?
I am not sure exactly what you want to do, but for a project I worked on I needed to get a specific playlist from YouTube and make its contents public to the visitors of the website.
What I did is a sort of proxy: I set up a PHP file that contains the API key, and the end user gets the YouTube content through this PHP file.
The PHP file fetches the content from YouTube using cURL.
I hope this helps.
EDIT 1
The way to hide the key is to put it in a PHP file on the server.
This PHP file will be the one connecting to YouTube and retrieving the data you want to show on your client page.
The example code below, with the correct API key and playlist ID, will fetch a JSON document with the first 10 items of the playlist.
$resp will hold the JSON data. To use it, decode it, for example into an associative array. Once in the array, the data can easily be mixed into the HTML that is rendered in the client browser.
<?php
$apiKey = "AIza...";
$results = "10";
$playList = "PL0WeB6UKDIHRyXXXXXXXXXX...";
$request = "https://www.googleapis.com/youtube/v3/playlistItems?part=id,contentDetails,snippet&maxResults=" . $results .
"&fields=items(contentDetails%2FvideoId%2Cid%2Csnippet(position%2CpublishedAt%2Cthumbnails%2Fdefault%2Ctitle))" .
"&playlistId=" . $playList .
"&key=" . $apiKey;
$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_URL => $request,
    CURLOPT_SSL_VERIFYPEER => false
));
$resp = curl_exec($curl);
if (curl_errno($curl)) {
    $status = "CURL_ERROR";
} else {
    // check the HTTP status code of the request
    $resultStatus = curl_getinfo($curl, CURLINFO_HTTP_CODE);
    if ($resultStatus == 200) {
        $status = "OK";
        //Do something with the $resp which is in JSON format.
        //Like decoding it into an associative array
    } else {
        $status = "YT_ERROR";
    }
}
curl_close($curl);
?>
<html>
<!-- your html here -->
</html>
Note: CURLOPT_SSL_VERIFYPEER is set to false here. That is only for development; in production it should be true.
Also note that when using the API this way, you can restrict the calls made with your API key by binding the key to your domain. You do that in the Google API Console. (Tip for production.)
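To illustrate the decoding step mentioned above, here is a small sketch that could follow the cURL call; the field names simply mirror the part and fields parameters in the request URL:
<?php
// Continuing from the request above: decode $resp and mix it into the page's HTML.
if ($status === "OK") {
    $data = json_decode($resp, true);   // associative array

    foreach ($data['items'] as $item) {
        $title   = htmlspecialchars($item['snippet']['title']);
        $videoId = $item['contentDetails']['videoId'];
        echo '<p><a href="https://www.youtube.com/watch?v=' . $videoId . '">' . $title . '</a></p>';
    }
}
?>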
I have a query that runs and I can see the results. But when I try to save the query as a view, I get an error message saying:
Failed to save view. No suitable credentials found to access Google Drive. Contact the table owner for assistance.
I think the problem is caused by a table used in the query. The table is loaded from a Google Sheet (via its source URI) and is owned by me. I have tried enabling the Google Drive API for the project, but no luck. I am not sure how to give BigQuery access to Google Drive.
I suspect the problem you are hitting is one of OAuth Scopes. In order to talk to the Google Drive API to read data, you need to use credentials that were granted access to that API.
If you are using the BigQuery web UI and have not explicitly granted access to Drive, it won't work. For example, the first time I tried to "Save to Google Sheets", the BigQuery UI popped up an OAuth prompt asking me to grant access to my Google Drive. After this it could save the results. Try doing this to make sure your credentials have the Drive scope and then "Save View" again.
If you are using your own code to do this, you should request scope 'https://www.googleapis.com/auth/drive' in addition to the 'https://www.googleapis.com/auth/bigquery' scope you are already using to talk to BigQuery.
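For example, with the Google APIs PHP client the change amounts to adding the Drive scope next to the BigQuery one (a sketch only; the autoloader path and the authentication step depend on your setup):
<?php
require_once 'google-api-php-client/autoload.php'; // path is an assumption

$client = new Google_Client();
$client->setScopes(array(
    'https://www.googleapis.com/auth/bigquery',
    'https://www.googleapis.com/auth/drive',   // lets BigQuery read the Sheets-backed table
));
// ... authenticate here as you already do (OAuth flow or service account) ...

$bigquery = new Google_Service_Bigquery($client);
// Queries and view creation now run with credentials that carry the Drive scope.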
If you are using the bq client, it has been updated to request this scope, but you may need to re-initialize your authentication credentials. You can do this with bq init --delete_credentials to remove the credentials; your next action will then re-request them.
Using Google Apps Script, this worked for me:
function saveQueryToTable() {
  var projectId = '...yourprojectid goes here...';
  var datasetId = '...yourdatesetid goes here...';
  var sourceTable = '...your table or view goes here...';
  var destTable = '...destination table goes here...';
  var myQuery;

  //just a random call to activate the Drive API scope
  var test = Drive.Properties.list('...drive file id goes here...');

  //list all tables for the particular dataset
  var tableList = BigQuery.Tables.list(projectId, datasetId).getTables();

  //if the table exists, delete it
  for (var i = 0; i < tableList.length; i++) {
    if (tableList[i].tableReference.tableId == destTable) {
      BigQuery.Tables.remove(projectId, datasetId, destTable);
      Logger.log("DELETED: " + destTable);
    }
  }

  myQuery = 'SELECT * FROM [PROJECTID:DATASETID.TABLEID];'
    .replace('PROJECTID', projectId)
    .replace('DATASETID', datasetId)
    .replace('TABLEID', sourceTable);

  var job = {
    configuration: {
      query: {
        query: myQuery,
        destinationTable: {
          projectId: projectId,
          datasetId: datasetId,
          tableId: destTable
        }
      }
    }
  };

  var queryResults = BigQuery.Jobs.insert(job, projectId);
  Logger.log(queryResults.status);
}
The 'trick' was a random call to the Drive API to ensure both the BigQuery and Drive scopes are included.
Google Apps Script Project Properties
I've signed up for the free month trial of Azure, and I have created a Mobile Service. I'm using iOS, so I downloaded the sample Todo app for iOS.
I am now trying to use Table Storage in the back end instead of a MSSQL store; I have found instructions on using Table Storage here: http://azure.microsoft.com/en-us/documentation/articles/storage-nodejs-how-to-use-table-storage/
However, my app is still storing todo items in the MSSQL storage. I've been told that I don't need to do anything in the client to make the switch, so I assume everything I need to do must be done in the node.js scripts. But I'm clearly missing something.
One thing that confuses me is that after I downloaded the generated node.js script for the Todo app, I didn't see anything in it that seemed to be explicitly talking to the MSSQL database.
Any pointers would be greatly appreciated.
EDIT:
here's my todoitem.insert.js:
var azure = require('azure-storage');
var tableSvc = azure.createTableService();
function insert(item, user, request) {
    // request.execute();
    console.log('Request received');
    console.log(request);

    var entGen = azure.TableUtilities.entityGenerator;
    var task = {
        PartitionKey: entGen.String('learningazure'),
        RowKey: entGen.String('1'),
        description: entGen.String('add something to TS'),
        dueDate: entGen.DateTime(new Date(Date.UTC(2014, 11, 5)))
    };

    tableSvc.insertEntity('codedelphi', task, {echoContent: true}, function (error, result, response) {
        if (!error) {
            // Entity inserted
            console.log('No error on table insert: task created.');
            request.respond(statusCodes.SUCCESS, 'OK.');
        } else {
            console.log('Houston, we have a problem. Entity not added to table.');
            console.log(error);
        }
    });

    console.log(JSON.stringify(item, null, 4));
}

tableSvc.createTableIfNotExists('codedelphi', function(error, result, response) {
    if (!error) {
        // Table exists or created
        console.log('No error, table should exist');
    } else {
        console.log('We have a problem.');
        console.log(error);
    }
});
Mobile Services has the built-in capability to handle talking to your SQL database for you. When your script calls request.execute(), that triggers whatever the request is (insert, update, delete, select) to be run against the SQL database. Talking to Table Storage instead of SQL requires you to edit those scripts to explicitly talk to Table Storage (i.e. perform your inserts, updates, deletes, and reads yourself). Today there is no magic switch that will change request.execute() from talking to SQL to talking to Table Storage. If you've already edited your scripts to talk to Table Storage and it's not working / you still see data stored in your SQL database, I would suspect that you are either still calling request.execute() in your scripts, or you haven't pushed them to your Mobile Service (if you've pulled them down locally and then need to push them back to your service). If you've done all of the above, update your question with the Node.js script in question so we can see it.
As Chris pointed out, you are most likely still calling request.execute() from your table scripts. By design, this explicitly talks to the MSSQL database you configured your application with. You will have to edit your table scripts so they no longer call request.execute() and instead interact with the TableService object.
If you follow the tutorial, and do the following:
1. Import the package.
2. Create the table service object.
3. Create an entity (and modify the variables to store the data you need).
4. Write the entity to your table service.
You should see data being written to Table Storage rather than the SQL database.
Give it a shot and ping back; we'll help you out.
I am working on a website that allows the user to search for the top ten Twitter trends in a city or country. At first I was relying only on Twitter's REST API, but I was having a lot of rate-limit issues (at school my rate limit disappears faster than I have a chance to use it). I know that authenticating my API calls will help me deal with this issue (authenticated API calls are charged to the authenticating user's limit, while unauthenticated API calls are deducted from the calling IP address' allotment).
I implemented @abraham's PHP library (https://github.com/abraham/twitteroauth), but unfortunately my API calls aren't being authenticated. I know I have implemented @abraham's library, because it prints out my user information at the end like it should. I have my Twitter trend search underneath it, but the API call isn't being authenticated. I am not sure how to fix this, and any help would really be appreciated!
This is what I use to get the top ten trends by country:
function showContent(){
    // we're going to point to Yahoo's APIs
    $BASE_URL = "https://query.yahooapis.com/v1/public/yql";

    // the following code should only run if we've submitted a form
    if(isset($_REQUEST['location']))
    {
        // set a variable named "location" to whatever we passed from the form
        $location = $_REQUEST['location'];

        // Form YQL query and build URI to YQL Web service in two steps:
        // first, we show the query
        $yql_query = "select woeid from geo.places where text='$location'";

        // then we combine the $BASE_URL and query (urlencoded) together
        $yql_query_url = $BASE_URL . "?q=" . urlencode($yql_query) . "&format=json";
        //var_dump($location);

        // show what we're calling
        // echo $yql_query_url;

        // Make call with cURL (curl pulls webpages - it's very common)
        $session = curl_init($yql_query_url);
        curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
        $json = curl_exec($session);

        // Convert JSON to PHP object
        $phpObj = json_decode($json);

        // Confirm that results were returned before parsing
        if(!is_null($phpObj->query->results)){
            // Parse results and extract data to display
            foreach($phpObj->query->results as $result){
                //var_dump($result);
                $woeid = $result[0]->woeid;
                if (is_numeric($location))
                {
                    echo "<span style='color:red; padding-left: 245px;'>Please enter a city or a country</span>";
                }
                else if(empty($result)){
                    echo "No results found";
                }
                else {
                    /* echo "The woeid of $location is $woeid <br />"; */
                }
            }
        }

        $jsontrends = file_get_contents("http://api.twitter.com/1/trends/".$woeid.".json");
        $phpObj2 = json_decode($jsontrends, true);
        echo "<h3 style='margin-top:20px'>TRENDS: ".$phpObj2[0]['locations'][0]['name']."</h3> \r\n";
        $data = $phpObj2[0]['trends'];
        foreach ($data as $item) {
            echo "<br />".$item['name']."\r\n";
            echo "<br /> \r\n";
        }
        if(empty($item)){
            echo "No results found";
        }
    }
}
I then add it to @abraham's html.inc file (along with some PHP to check the rate limit status), and html.inc is included in index.php:
<h1>Top Twitter Trends</h1>
<form name='mainForm' method="get">
<input name='location' id='location' type='text'/><br/>
<button id='lookUpTrends'>Submit</button>
</form>
<?php showContent();
$ratelimit = file_get_contents("http://api.twitter.com/1/account/rate_limit_status.json");
echo $ratelimit;
?>
</div>
@abraham's index.php file has some example calls, and since my call doesn't look like these, I think that is probably why it isn't being authenticated.
/* Some example calls */
//$connection->post('statuses/update', array('status' => date(DATE_RFC822)));
//$connection->post('statuses/destroy', array('id' => 5437877770));
//$connection->post('friendships/create', array('id' => 9436992));
//$connection->post('friendships/destroy', array('id' => 9436992));
Please help me find what I need to fix so that my API calls are authenticated.
Update 10-21
I think that in order to make an authenticated API call I need to include something like this in my code:
$connection->get('trends/place', array('id' => $woeid));
It didn't fix my problem, but maybe it is on the right track?
First off, you'll find that keeping your PHP and HTML separate will really help streamline your code and keep logical concerns separate (aggregating the data and displaying it are two different concerns; many PHPers like MVC for this).
The code you have shared appears to be correct. My guess is that the issue lies in the creation of the OAuth connection, which should look something like:
<?php
/* Create TwitteroAuth object with app key/secret and token key/secret from default phase */
$connection = new TwitterOAuth(CONSUMER_KEY, CONSUMER_SECRET, $token,$secret);
Where CONSUMER_KEY and CONSUMER_SECRET are from your Trends Test app and $token and $secret are from the user signing in to twitter and allowing your app permission. Are all these values showing up when you create the TwitterOAuth object?
Also, be sure you update the config items in the twitteroauth.php file (specifically line 21 should be set to use the 1.1 API and line 29 should be set to 'json').
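Once the TwitterOAuth object is created as above, the trends request itself should go through that authenticated connection rather than through the unauthenticated file_get_contents() call. A rough sketch (it assumes twitteroauth.php is configured for the 1.1 API and 'json' format, as noted, and that $woeid comes from the YQL lookup in showContent()):
<?php
require_once 'twitteroauth/twitteroauth.php'; // adjust the path to your copy of the library

// Same constructor call as in the answer above.
$connection = new TwitterOAuth(CONSUMER_KEY, CONSUMER_SECRET, $token, $secret);

// Authenticated call: counts against the user's rate limit, not the IP's.
$trends = $connection->get('trends/place', array('id' => $woeid));

echo "<h3>TRENDS: " . $trends[0]->locations[0]->name . "</h3>\n";
foreach ($trends[0]->trends as $trend) {
    echo "<br />" . $trend->name . "\n";
}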