I'm trying to do something really simple: send a string to a server and save it to a txt file. At the moment I'm getting nothing, no txt file, nothing at all. I'm looking for a really simple solution. Here is what I have:
<script>
function submit()
{
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.open("POST", "test.php?d=Hey", true);
    xmlhttp.setRequestHeader("Content-Type: text/plain");
    xmlhttp.send();
}
</script>
<?php
$d = $_POST["d"];
$file = fopen("test.txt", "a");
fwrite($file, $d + "\n");
fclose($file);
?>
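For reference, two things would stop any file from appearing here. First, setRequestHeader() takes the header name and value as two separate arguments, and $_POST is only populated when the data travels in the request body as form data; a query string on the URL lands in $_GET instead. So the client side would need something like xmlhttp.open("POST", "test.php", true); xmlhttp.setRequestHeader("Content-Type", "application/x-www-form-urlencoded"); xmlhttp.send("d=Hey");. Second, PHP concatenates strings with a dot, so $d + "\n" does arithmetic (or throws a TypeError on PHP 8) instead of appending a newline. A minimal sketch of the matching test.php, assuming the web server is allowed to write to its own directory:

<?php
// test.php - a sketch; expects d=... in the POST body, not the query string
$d = isset($_POST["d"]) ? $_POST["d"] : "";
$file = fopen("test.txt", "a");
fwrite($file, $d . "\n"); // '.' is PHP's string concatenation operator
fclose($file);
?>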
I have tried every method I could find and always get the same error. Can someone help me and give me some advice?
P.S. I am new to web development.
Thanks.
<?php
use \Psr\Http\Message\ServerRequestInterface as Request;
use \Psr\Http\Message\ResponseInterface as Response;

$app = new \Slim\App;

$app->get('/api/products', function (Request $req, Response $res) {
    $query = 'SELECT * FROM products';
    try {
        $db = new database();
        $db = $db->connection();
        $stmt = $db->query($query);
        $products = $stmt->fetch_object();
        echo json_encode($products);
    } catch (Exception $e) {
        echo $e;
    }
});

$app->post('/api/add', function () use ($app) {
    $request = Slim::getInstance()->request();
    $name = json_decode($request->getBody());
    //$content = file_get_contents($req);
    //$json = json_decode($content, true);
    //$post = $id->post;
    echo $name;
});
Follow the tutorial: you need to require Composer's autoloader before anything instantiates \Slim\App. If you check your PHP error log, you'll see the actual error (most likely a "Class 'Slim\App' not found" fatal error).
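A minimal sketch of the bootstrap, assuming Slim 3 was installed with Composer and this script sits next to the resulting vendor/ directory (the path is an assumption):

<?php
// Assumption: Slim 3 installed via `composer require slim/slim "^3.0"`,
// so the autoloader lives in vendor/ next to this file.
require __DIR__ . '/vendor/autoload.php';

use \Psr\Http\Message\ServerRequestInterface as Request;
use \Psr\Http\Message\ResponseInterface as Response;

$app = new \Slim\App;

// ... route definitions from the question go here ...

$app->run(); // without run() no route is ever dispatched

Note also that Slim::getInstance()->request() is the Slim 2 API; in Slim 3 the request is the Request object already passed to the route callback, so the /api/add handler should read the body from its $req parameter instead.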
This XML feed, which can be accessed at http://afdclinics.com/persistentpresence/category/brentwood/lobby-1/feed/, has a custom_fields node with two fields called custom-bgcolor and custom-fontcolor. I have tried numerous ways to access the data inside them, with no luck.
I have been accessing other nodes with SimpleXML, but haven't been able to get custom_fields working. Here is what I have so far.
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'http://afdclinics.com/persistentpresence/category/brentwood/lobby-1/feed/');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($curl);
if ($result === false) {
    die('Error fetching data: ' . curl_error($curl));
}
curl_close($curl);
//we can at this point echo the XML if you want
//echo $result;
//parse xml string into SimpleXML objects
$xml = simplexml_load_string($result);
if ($xml === false) {
    die('Error parsing XML');
}
//now we can loop through the xml structure
foreach ($xml->channel->item as $item) {
    //print $item->title;       // rss feed article title
    //print $item->description; // rss feed article description
    //print $item->link;        // rss feed article link
    //print $item->pubDate;     // rss feed article publish date
    print $item->children('content', true); // rss feed article content

    // here is where I would like to print the custom values
    print $item->custom_fields->custom-bgcolor; // this line doesn't seem to work

    // gets img urls and appends them to offline manifest file
    $imgUrl = array();
    $doc2 = new DOMDocument();
    $doc2->loadHTML($item->children('content', true));
    $imgUrl = simplexml_import_dom($doc2);
    $images = $imgUrl->xpath('//img');
    foreach ($images as $img) {
        $imgUrl = $img['src'] . "\r\n";
        print $imgUrl; // rss feed image urls
        $i++;
    }
}
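For the line marked as not working: in PHP the hyphen in $item->custom_fields->custom-bgcolor is parsed as a minus sign, so SimpleXML never sees the full element name; hyphenated children need the curly-brace syntax. A runnable sketch against a made-up fragment shaped like the feed described above (the real feed's structure is assumed, not verified):

<?php
// hypothetical fragment mirroring the structure described in the question
$rss = '<rss><channel><item>
    <custom_fields>
        <custom-bgcolor>#ffffff</custom-bgcolor>
        <custom-fontcolor>#000000</custom-fontcolor>
    </custom_fields>
</item></channel></rss>';

$xml = simplexml_load_string($rss);
foreach ($xml->channel->item as $item) {
    // curly braces let SimpleXML address element names containing a hyphen
    print $item->custom_fields->{'custom-bgcolor'} . "\n";
    print $item->custom_fields->{'custom-fontcolor'} . "\n";
}
?>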
I show a lot of thumbnails from YouTube videos on my homepage. I was using the function below to grab the thumbnail from a YouTube URL, which works fast, but it doesn't handle URLs in the shortened form like youtu.be/JSHDLSKL.
function get_youtube_screen_link( $url = '', $type = 'default', $echo = true ) {
    if( empty( $url ) )
        return false;
    if( !isset( $type ) )
        $type = '';
    $url = esc_url( $url );
    preg_match("|[\\?&]v=([^&#]*)|", $url, $vid_id);
    if( !isset( $vid_id[1] ) )
        return false;
    $img_server_num = 'i' . rand(1,4);
    switch( $type ) {
        case 'large':
            $img_link = "http://{$img_server_num}.ytimg.com/vi/{$vid_id[1]}/0.jpg";
            break;
        case 'first':
            // Thumbnail of the first frame
            $img_link = "http://{$img_server_num}.ytimg.com/vi/{$vid_id[1]}/1.jpg";
            break;
        case 'small':
            // Thumbnail of a later frame (I'm not sure how they determine this)
            $img_link = "http://{$img_server_num}.ytimg.com/vi/{$vid_id[1]}/2.jpg";
            break;
        case 'default':
        case '':
        default:
            $img_link = "http://{$img_server_num}.ytimg.com/vi/{$vid_id[1]}/default.jpg";
            break;
    }
    if( $echo )
        echo $img_link;
    else
        return $img_link;
}
So I tried using oEmbed to get the thumbnails instead, which works for all variations of the YouTube URL, but it retrieves the 480x360 thumbnail, which means a lot of cropping to get it down to the 120x90 size I display them at. The other issue was that it added about 4 seconds to my page load time, which I'm guessing is a problem with the way I'm implementing it. Here's how I call the thumbnail inside a loop.
<?php
require_once(ABSPATH . 'wp-includes/class-oembed.php');
$oembed = new WP_oEmbed;
$name = get_post_meta($post->ID, 'video_code', true);
$url = $name;
//As noted in the comments below, you can auto-detect the video provider with the following
$provider = $oembed->discover($name);
//$provider = 'http://www.youtube.com/oembed';
$video = $oembed->fetch($provider, $url, array('width' => 300, 'height' => 175));
$thumb = $video->thumbnail_url;
if ($thumb) { ?>
    <img src="<?php echo $thumb; ?>" width="120" height="90" />
<?php } ?>
So how should I be doing this to maximize efficiency?
I came across this page from YouTube explaining their oEmbed support. They mention that they can output JSON, so I made a function that fetches the JSON data and puts it at your disposal.
Feel free to ask if you need more help.
<?php
$youtube_url = 'http://youtu.be/oHg5SJYRHA0'; // url to youtube video

function getJson($youtube_url) {
    $baseurl = 'http://www.youtube.com/oembed?url='; // youtube oembed base url
    $url = $baseurl . urlencode($youtube_url) . '&format=json'; // encode the video url and ask for json
    $json = json_decode(file_get_contents($url)); // fetch the url and decode the json
    return $json;
}
$json = getJson($youtube_url);
// from this point on you have all your data placed in variables.
$provider_url = $json->{'provider_url'};
$thumbnail_url = $json->{'thumbnail_url'};
$title = $json->{'title'};
$html = $json->{'html'};
$author_name = $json->{'author_name'};
$height = $json->{'height'};
$thumbnail_width = $json->{'thumbnail_width'};
$thumbnail_height = $json->{'thumbnail_height'};
$width = $json->{'width'};
$version = $json->{'version'};
$author_url = $json->{'author_url'};
$provider_name = $json->{'provider_name'};
$type = $json->{'type'};
echo '<img src="'.$thumbnail_url.'" />'; // echo'ing out the thumbnail image
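Because YouTube's oEmbed endpoint resolves the URL on its side, the same call works for the short form as well; for example, reusing getJson() from above:

// the short youtu.be form resolves the same way as a full watch?v= URL
$json = getJson('http://youtu.be/oHg5SJYRHA0');
echo '<img src="' . $json->thumbnail_url . '" />';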
OK, I came up with a solution from pieces of other questions. First we need to get the ID from any type of URL YouTube has, using this function.
function getVideoId($url)
{
    $parsedUrl = parse_url($url);
    if ($parsedUrl === false)
        return false;
    if (!empty($parsedUrl['query']))
    {
        $query = array();
        parse_str($parsedUrl['query'], $query);
        if (!empty($query['v']))
            return $query['v'];
    }
    if (strtolower($parsedUrl['host']) == 'youtu.be')
        return trim($parsedUrl['path'], '/');
    return false;
}
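A quick sanity check of getVideoId() with both URL shapes (same sample video as above):

// both forms should yield the same 11-character id
var_dump(getVideoId('http://www.youtube.com/watch?v=oHg5SJYRHA0')); // string(11) "oHg5SJYRHA0"
var_dump(getVideoId('http://youtu.be/oHg5SJYRHA0'));                // string(11) "oHg5SJYRHA0"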
Now we can use the YouTube Data API to get the thumbnail from the video ID. It looks like this.
<?php
$vid_id = getVideoId($video_code);
$json = json_decode(file_get_contents("http://gdata.youtube.com/feeds/api/videos/$vid_id?v=2&alt=jsonc"));
echo '<img src="' . $json->data->thumbnail->sqDefault . '" width="176" height="126">';
?>
The problem is that this causes an extra 2 seconds of load time, so I simply take the $vid_id and place it inside http://i3.ytimg.com/vi/<?php echo $vid_id; ?>/default.jpg, which gets rid of the 2 seconds added by hitting the YouTube API.
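Put together, the whole thing reduces to a couple of lines (a sketch combining getVideoId() from above with the static thumbnail URL, so no API round trip is made):

<?php
$video_code = 'http://youtu.be/oHg5SJYRHA0'; // any watch?v= or youtu.be URL
$vid_id = getVideoId($video_code);
if ($vid_id !== false) {
    // ytimg.com serves thumbnails statically, so no API call is needed
    echo '<img src="http://i3.ytimg.com/vi/' . $vid_id . '/default.jpg" width="120" height="90" />';
}
?>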
I am using the tmhOAuth.php class to log in to Twitter. I have been successful in logging in and sending a tweet.
When I go to use the friends.php script, I am having some problems inserting into my database. I believe my problem lies somewhere with the $paging variable in the code, because it loops only seven times: I am following 626 people, so count($ids) is 626 and $paging is ceil(626 / 100) = 7.
When I run the PHP in a web browser, I am only able to extract 7 of the people I follow (following user #626, following user #526, following user #426, ...). It seems to echo only the last user of each page request, due in part to requesting 100 user IDs at a time via the PAGESIZE constant. When I set $paging to a different number such as 626 I get {"errors":[{"code":17,"message":"No user matches for specified terms"}]}.
Unfortunately, I suspect this is a fairly simple PHP looping problem, but after the amount of time I have spent trying to crack it I can no longer think straight.
Thanks in advance.
define('PAGESIZE', 100);

require 'tmhOAuth.php';
require 'tmhUtilities.php';

// (The original paste omits the tmhOAuth setup and the start of the cursored
// loop; the opening below, up to the first request, is a reconstruction and
// the exact setup is an assumption.)
$ids = array();
$cursor = '-1';
while ($cursor != '0') :
    $tmhOAuth->request('GET', $tmhOAuth->url('1/friends/ids'), array('cursor' => $cursor));
    if ($tmhOAuth->response['code'] == 200) {
        $data = json_decode($tmhOAuth->response['response'], true);
        $ids += $data['ids'];
        $cursor = $data['next_cursor_str'];
    } else {
        echo $tmhOAuth->response['response'];
        break;
    }
endwhile;
// lookup users
$paging = ceil(count($ids) / PAGESIZE);
$users = array();
for ($i = 0; $i < $paging; $i++) {
    $set = array_slice($ids, $i * PAGESIZE, PAGESIZE);
    $tmhOAuth->request('GET', $tmhOAuth->url('1/users/lookup'), array(
        'user_id' => implode(',', $set)
    ));

    // check the rate limit
    check_rate_limit($tmhOAuth->response);

    if ($tmhOAuth->response['code'] == 200) {
        $data = json_decode($tmhOAuth->response['response'], true);
        $name = array();
        foreach ($data as $val)
        {
            $name = $data[0]['screen_name'];
        }
        echo "this is the screen name " . $name . "\n";
        $users += $data;
    } else {
        echo $tmhOAuth->response['response'];
        break;
    }
}
var_dump($users);
?>
The data I am trying to echo, then parse and insert into the database, is the standard Twitter JSON user data, so I won't include it in the message. Any help would be appreciated.
Problem solved:
foreach ($data as $val)
{
    $name = $val['screen_name'];
    echo "this is the screen name " . $name . "\n";
    $users[] = $name;
}
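From there, inserting into the database is an ordinary prepared-statement loop. A hedged sketch using PDO; the DSN, credentials, and the friends table are placeholders rather than anything taken from the question:

// placeholders: adjust the DSN, credentials and table to your schema
$pdo = new PDO('mysql:host=localhost;dbname=twitter', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO friends (screen_name) VALUES (?)');
foreach ($users as $screenName) {
    $stmt->execute(array($screenName));
}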
I've just got my MVC view export to Excel working great. However, because I'm setting location.href, this leaves the user with a page full of CSV data rather than the neat grid of results they had before hitting the Export button.
I'm trying to work out how to change the following script to do what it's doing but leave the page alone. I tried making a call back to the server to re-run the search, but at runtime the user sees the CSV on the web page momentarily, which is not good.
Any ideas much appreciated. Cheers.
$(function() {
    $('#exportButton').click(function() {
        var url = $('#AbsolutePath').val() + 'Waste.mvc/Export';
        var data = {
            searchText: $('#SearchTextBox').val().toString(),
            searchTextSite: $('#SearchTextBoxSite').val().toString(),
            StartDate: $('#StartDate').val(),
            EndDate: $('#EndDate').val()
        };
        $('#ResultsList').load(url, data, function() {
            $('#LoadingGif').empty();
            location.href = url + "?searchText=" + data.searchText + "&searchTextSite=" + data.searchTextSite + "&StartDate=" + data.StartDate + "&EndDate=" + data.EndDate;
        });
        //Search(); this fixes it because the grid is displayed again after the csv results
    });
});
My Controller Code:
public FileStreamResult Export(string searchText, string searchTextSite, string StartDate, string EndDate)
{
    var searchResults = getSearchResults(searchText, searchTextSite, StartDate, EndDate);
    HttpContext.Response.AddHeader("content-disposition", "attachment; filename=Export.csv");
    var sw = new StreamWriter(new MemoryStream());
    sw.WriteLine("\"Ref\",\"Source\",\"Collected\"");
    foreach (var line in searchResults.ToList())
    {
        sw.WriteLine(string.Format("\"{0}\",\"{1}\",\"{2}\"",
            line.WasteId,
            line.SourceWasteTypeId.ToDescription(),
            line.CollectedDate.ToShortDateString()));
    }
    sw.Flush();
    sw.BaseStream.Seek(0, SeekOrigin.Begin);
    return new FileStreamResult(sw.BaseStream, "text/csv");
    // return File(sw.BaseStream, "text/csv", "report.csv"); renders the same result
}
You could have the controller action return the CSV file as an attachment (it will use the Content-Disposition: attachment; filename=report.csv HTTP header):
public ActionResult GetCsv()
{
    byte[] csvData = ...
    return File(csvData, "text/csv", "report.csv");
}
Now you can safely do a window.location.href = '/reports/getcsv'; and the user will be prompted to download the CSV report while the browser stays on the same page.
Thanks for the guidance, it helped get me there. I set up a DIV with its visibility hidden:
<div id="ExportList" style="visibility:hidden;clear:both;">
</div>
I placed this hidden div below the DIV that holds the grid results; the grid results stay in place while the hidden CSV stream is written to the page, the download prompt appears, and the file opens in Excel.
Here is the JavaScript I finished on. I think this could be neater, without the extra DIV, but it's not too bad:
$(function() {
    $('#exportButton').click(function() {
        var url = $('#AbsolutePath').val() + 'Waste.mvc/Export';
        var data = {
            searchText: $('#SearchTextBox').val().toString(),
            searchTextSite: $('#SearchTextBoxSite').val().toString(),
            StartDate: $('#StartDate').val(),
            EndDate: $('#EndDate').val()
        };
        $('#ExportList').load(url, data, function() {
            $('#LoadingGif').empty();
            location.href = url + "?searchText=" + data.searchText + "&searchTextSite=" + data.searchTextSite + "&StartDate=" + data.StartDate + "&EndDate=" + data.EndDate;
        });
    });
});