Specifying charset in HTTP header with Lighttpd - character-encoding

I'm trying to specify a charset in the HTTP header of my Lighttpd setup. I've tried numerous suggestions I've found across Stack Exchange's sites.
1. Tried looking in the mime.types file, so I could just add ; charset=utf-8 at the end of whatever file types I wanted to specify a charset for in the HTTP header, but the mime.types file looks nothing like I expected: http://pastebin.com/QMKJ8Lqj
2. Tried changing create-mime.assign.pl from this:
#!/usr/bin/perl -w
use strict;
open MIMETYPES, "/etc/mime.types" or exit;
print "mimetype.assign = (\n";
my %extensions;
while(<MIMETYPES>) {
chomp;
s/\#.*//;
next if /^\w*$/;
if(/^([a-z0-9\/+-.]+)\s+((?:[a-z0-9.+-]+[ ]?)+)$/) {
foreach(split / /, $2) {
# mime.types can have same extension for different
# mime types
next if $extensions{$_};
$extensions{$_} = 1;
print "\".$_\" => \"$1\",\n";
}
}
}
print ")\n";
Into this:
#!/usr/bin/perl -w
use strict;
open MIMETYPES, "/etc/mime.types" or exit;
print "mimetype.assign = (\n";
my %extensions;
while(<MIMETYPES>) {
chomp;
s/\#.*//;
next if /^\w*$/;
if(/^([a-z0-9\/+-.]+)\s+((?:[a-z0-9.+-]+[ ]?)+)$/) {
my $pup = $1;
foreach(split / /, $2) {
# mime.types can have same extension for different
# mime types
next if $extensions{$_};
next if not defined $pup;
next if $pup eq '';
$extensions{$_} = 1;
if ($pup =~ /^text\//) {
print "\".$_\" => \"$pup; charset=utf-8\",\n";
} else {
print "\".$_\" => \"$pup\",\n";
}
}
}
}
print ")\n";
And restarted the Lighttpd server afterwards - nothing.
3. Afterwards I tried adding the following to the lighttpd.conf file:
mimetype.assign = (
".css" => "text/css; charset=utf-8",
".html" => "text/html; charset=utf-8",
".htm" => "text/html; charset=utf-8",
".js" => "text/javascript; charset=utf-8",
".text" => "text/plain; charset=utf-8",
".txt" => "text/plain; charset=utf-8",
".xml" => "text/xml; charset=utf-8"
)
And it gave me an error that it couldn't restart the Lighttpd server because it found a duplicate definition of the "mimetype.assign" config variable: one from create-mime.assign.pl and one in lighttpd.conf. I know I could remove include_shell "/usr/share/lighttpd/create-mime.assign.pl" from lighttpd.conf so that there aren't duplicate config variables, but what about all the other MIME types?
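One hedged way around the duplicate definition, assuming include_shell hands its argument to a shell (so a pipe is allowed), is to keep the generated list as the only mimetype.assign source and post-process it for the text types instead of defining the variable a second time:
# Hedged sketch for lighttpd.conf: the sed expressions and the choice of
# text types are illustrative; extend the list as needed.
include_shell "/usr/share/lighttpd/create-mime.assign.pl | sed -e 's|text/plain|text/plain; charset=utf-8|' -e 's|text/css|text/css; charset=utf-8|' -e 's|text/html|text/html; charset=utf-8|'"
This leaves every other generated MIME type untouched while the listed text types gain a charset.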
General info:
Lighttpd version: 1.4.28
PHP version: 5.3.29-1
Linux: Debian 6.0 Squeeze
Lighttpd.conf: http://pastebin.com/N6GrdUsi

Please try a newer version of lighttpd.
I am looking at 1.4.36, and doc/scripts/create-mime.conf.pl contains a list of extensions to which it appends "; charset=utf-8":
# text/* subtypes to serve as "text/...; charset=utf-8"
# text/html IS NOT INCLUDED: html has its own method for defining charset
# (<meta>), but the standards specify that content-type in HTTP wins over
# the setting in the html document.
my %text_utf8 = map { $_ => 1 } qw( # ......
You can find it in the git sources: https://github.com/lighttpd/lighttpd1.4/blob/master/doc/scripts/create-mime.conf.pl
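The entries it emits then end up looking like this (an illustrative sketch based on the quoted comment, not a verbatim excerpt of the generated file):
".txt" => "text/plain; charset=utf-8",
".css" => "text/css; charset=utf-8",
".html" => "text/html",
Note that .html keeps a bare text/html, in line with the comment about text/html not being included.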

Related

Why does S3 change content type of my CSV

I'm using Ruby on Rails to generate a presigned URL so I can upload a CSV file. This works perfectly; I can even get the CSV using a presigned URL. The problem is that when I get the CSV file, a random block of text appears at the top of the CSV.
------WebKitFormBoundaryrnmGKBwtkSrSvPUR
Content-Disposition: form-data; name="file"; filename="MOCK_DATA.csv"
Content-Type: application/octet-stream
id,first_name,last_name,email,gender,job_title,city,country
1,Emlyn,Dayce,edayce0#example.com,Male,Quality Control Specialist,Debrecen,Hungary
So when I try to loop through the CSV using the code below:
require 'csv'
require 'open-uri'
csv = CSV.new(open(presigned_url), headers: false)
csv.each do |csv|
puts csv.to_s
end
I get the following error:
CSV::MalformedCSVError (Illegal quoting in line 2.):
This refers to the line:
csv.each do |csv|
Any solutions on (a) how to remove this block of text before looping / while parsing the CSV, or better yet (b) how to prevent the block of text from being added in the first place when using S3 presigned URLs?
Note: I have tried to add
content_type: 'text/csv'
to the presigned request; however, it doesn't recognize the param.
UPLOADING PROCESS:
I am using Vuejs to upload the csv to S3.
let formData = new FormData();
formData.append("file", this.$refs.file.files[0], { contentType: 'text/csv'
});
this.axios
.put(this.presigned_url, formData, {
headers: {
'Content-Type': 'multipart/form-data'
}
})
.then(response => {
// Handle response
if(response.status == 200){
this.original_file_name = "Processing CSV..."
this.processCsv();
}
});
I was struggling with the same issue; the problem is that you are sending the entire form to S3 and not just the file.
A correct code snippet in your case would be something like this:
this.axios
.put(this.presigned_url, this.$refs.file.files[0], {
headers: {
'Content-Type': 'text/csv'
}
})
.then(response => {
// Handle response
if(response.status == 200){
this.original_file_name = "Processing CSV..."
this.processCsv();
}
});
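If you also want S3 itself to enforce the content type, here is a hedged sketch of how it can be baked into the presigned PUT on the Rails side, assuming the aws-sdk-s3 gem (bucket name and key are placeholders); the client must then send the matching Content-Type header on the PUT, exactly as in the snippet above:
require 'aws-sdk-s3'

# Hypothetical bucket/key; adapt to however the URL is generated today.
object = Aws::S3::Resource.new.bucket('my-bucket').object('uploads/MOCK_DATA.csv')
presigned_url = object.presigned_url(:put, content_type: 'text/csv', expires_in: 3600)
With the content type included in the signature, a PUT that omits or changes Content-Type should be rejected, which guards against the application/octet-stream problem from the question.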

"Content Encoding Error" when I try to gzip the output

I'm trying to gzip the output of my controller action to save some bandwidth:
new ByteArrayOutputStream().withStream{ baos ->
new GZIPOutputStream( baos ).withWriter{ it << m.text.bytes }
//def gzip = baos.toByteArray().encodeBase64()
def gzip = new String( baos.toByteArray() )
response.setHeader 'Content-Type', 'application/x-javascript'
response.setHeader 'Content-Encoding', 'x-gzip'
response.outputStream.withStream{ it << gzip }
}
}
When I open the URL in a browser, it gives me
Unknown Error: net::ERR_CONTENT_DECODING_FAILED
in IE
or
Content Encoding Error
in FF
What am I missing?
def index() {
response.setHeader 'Content-Type', 'application/x-javascript'
response.setHeader 'Content-Encoding', 'x-gzip'
new GZIPOutputStream(response.outputStream).withWriter{ it << "Content comes here" }
}
Also consider using the capabilities of a web server in front of your webapp (e.g. Apache's gzip module can handle things like this much better). You would also have to check the capabilities of the client first (the Accept-Encoding header in the client request).
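If gzipping stays in the controller, a hedged sketch of that Accept-Encoding check (assuming a Grails controller, where request and response are the servlet objects):
import java.util.zip.GZIPOutputStream

def index() {
    // Only compress when the client advertises gzip support.
    def acceptsGzip = (request.getHeader('Accept-Encoding') ?: '').contains('gzip')
    response.setHeader 'Content-Type', 'application/x-javascript'
    if (acceptsGzip) {
        response.setHeader 'Content-Encoding', 'gzip' // 'gzip' is the standard token
        new GZIPOutputStream(response.outputStream).withWriter { it << "Content comes here" }
    } else {
        response.outputStream.withWriter { it << "Content comes here" }
    }
}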

ember file upload to rails, encoding and decoding with base64

I'm almost there, but I'm having an issue with decoding the file: after decoding, the file is not correct.
The code that I use to upload the file:
createDataSet: function() {
var data = new FormData();
data.append('original_filename', this.get('fileName'));
data.append('datafile', this.get('newData'));
data.append('project_id', this.get('content.id'));
data.append('name', this.get('content.name'));
$.ajax({
url: '/data_sets.json',
data: data,
cache: false,
contentType: false,
processData: false,
dataType: 'json',
type: 'POST',
success: function(data) {
alert('ok');
},
error: function(xhr, data, errorThrown) {
alert('error');
}
});
}
On the Rails side I'm trying to pick this up with the following method:
def create
# take care of the attachement
datasetfilename = Pathname.new(params[:original_filename]).basename
newfile = File.open(datasetfilename, 'w') do |f|
f.write(Base64.decode64(params[:datafile]))
end
@dataset = DataSet.new
@active_data_set = @dataset.active_data_sets.build
@active_data_set.project_id = params[:project_id]
@active_data_set.save
@dataset.name = params[:name]
@dataset.filename = datasetfilename
@dataset.tempfilename = @dataset.savefile newfile
@dataset.save
end
If I use File.open(datasetfilename, 'w') I get an error like this one: Encoding::UndefinedConversionError - "\xAB" from ASCII-8BIT to UTF-8. On the other hand, if I open with 'wb' the resulting file is mangled and can't be read.
I already added the meta tag for the file encoding, <meta charset="utf-8" />, but it made no difference.
If anybody has any hint that would be appreciated.
Just got this working in one of my own controllers; there are 2 main issues:
1) To resolve the encoding issue, use "w:binary" as the write flag instead of "w" (which uses the default text encoding).
2) The :datafile param includes some header info ("data:text/csv;base64,SUR4CUluZ..."). I'm currently splitting on ",", but it might be better to decode everything beyond "base64," as I'm not sure whether additional commas are allowed.
My working code (slightly different parameter names):
if params.key?(:img_file)
header, data = params[:img_file].split(',')
img_type = header.match(/image\/([a-z]{1,11});/)[1]
file_path = "imgtodo/fund_#{#fund.id}.#{img_type}"
File.open(Rails.root.join('public',file_path).to_s, 'w:binary') do |f|
f.write(Base64.decode64(data))
end
end
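For the comma concern in point 2, a hedged variant is to split on the first comma only, so everything after it stays in the payload:
# Base64 itself contains no commas, but splitting with a limit of 2 keeps the
# payload intact even if the header part ever changes.
header, data = params[:img_file].split(',', 2)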

Content Type not being passed in functional test using sfBrowser in Symfony 1.4

The test is for a POST API endpoint where the data is contained in the body of the POST as JSON. Prior to making the POST call I set the Content-Type to 'application/json'. However, when I test the format with isFormat('json'), the result is null. If I dump $request->getContentType(), this also produces null.
Any reason why setHttpHeader('Content-Type', 'application/json') is not setting the header correctly during functional testing?
Your setting method is correct, but inside sfBrowserBase there is this bug:
foreach ($this->headers as $header => $value)
{
$_SERVER['HTTP_'.strtoupper(str_replace('-', '_', $header))] = $value;
}
which sets Content-Type with the HTTP_ prefix. But in your action, the $request->getContentType() method assumes the header does not have that prefix.
So if you change it to this:
foreach ($this->headers as $header => $value)
{
$_SERVER[strtoupper(str_replace('-', '_', $header))] = $value;
}
then $request->getContentType() works correctly!
You can find the update here.
Right, thanks to @nicolx I can explain more about what is happening and offer some further guidance.
As noted by @nicolx, $request->getContentType() looks for the HTTP header without the HTTP_ prefix (see lines 163 to 173 in sfWebRequest). However, sfBrowserBase always adds the HTTP_ prefix to all headers. So add in this mod:
foreach($this->headers as $header => $value)
{
if(strtolower($header) == 'content-type' || strtolower($header) == 'content_type')
{
$_SERVER[strtoupper(str_replace('-','_',$header))] = $value;
} else {
$_SERVER['HTTP_'.strtoupper(str_replace('-','_',$header))] = $value;
}
}
That will deal with the Content-Type header being set and detectable in your actions. If you don't include the HTTP_ prefix, the other headers won't work (e.g. $request->isXmlHttpRequest() will fail even if you set the header in the test file).
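As a hedged illustration, with the unmodified else branch the usual XMLHttpRequest detection keeps working when the header is set in the test ($browser being the sfTestFunctional instance):
// Ends up in $_SERVER['HTTP_X_REQUESTED_WITH'], which is what
// sfWebRequest::isXmlHttpRequest() checks.
$browser->setHttpHeader('X-Requested-With', 'XMLHttpRequest');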
The test method isFormat() isn't testing the Content-Type header but the Symfony route setting sf_format. If I set the route to specifically have sf_format: json, e.g.
some_route:
url: /something/to/do
param: {module: top, action: index, sf_format: json}
then the test
with('request')->begin()->
isFormat('json')->
end()->
returns true.
As I wanted to test the header settings, I added a new tester method to sfTesterRequest called isContentType(). The code for this method is:
public function isContentType($type)
{
$this->tester->is($this->request->getContentType(),$type, sprintf('request method is "%s"',strtoupper($type)));
return $this->getObjectToReturn();
}
Calling this test simply becomes:
with('request')->begin()->
isContentType('Application/Json')->
end()->

How do you POST to a page using the PHP header() function?

I found the following code on here that I think does what I want, but it doesn't work:
$host = "www.example.com";
$path = "/path/to/script.php";
$data = "data1=value1&data2=value2";
$data = urlencode($data);
header("POST $path HTTP/1.1\r\n");
header("Host: $host\r\n");
header("Content-type: application/x-www-form-urlencoded\r\n");
header("Content-length: " . strlen($data) . "\r\n");
header("Connection: close\r\n\r\n");
header($data);
I'm looking to post form data without sending users to a middle page and then using JavaScript to redirect them. I also don't want to use GET so it isn't as easy to use the back button.
Is there something wrong with this code? Or is there a better method?
Edit: I was mistaken about what the header() function would do. I thought I could get the browser to post back to the server with the data, but this isn't what it's meant to do. Instead, I found a way in my code to avoid the need for a POST at all (not breaking and just continuing on to the next case within the switch).
The header() function is used to send HTTP response headers back to the user (i.e. you cannot use it to create request headers).
May I ask why you are doing this? Why simulate a POST request when you can act on the data right there and then? I'm assuming, of course, that script.php resides on your server.
To create a POST request, open up a TCP connection to the host using fsockopen(), then use fwrite() on the handle returned from fsockopen() with the same values you used in the header() calls in the OP. Alternatively, you can use cURL.
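For completeness, a hedged sketch of the cURL route (URL and field names are placeholders):
// Build and send the POST with cURL instead of a raw socket.
$ch = curl_init('http://www.example.com/path/to/script.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('data1' => 'value1', 'data2' => 'value2')));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);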
This answer is still needed today because not everyone wants to use cURL to consume web services. PHP also allows this, using the following code:
function get_info()
{
$post_data = array(
'test' => 'foobar',
'okay' => 'yes',
'number' => 2
);
// Send a request to example.com
$result = post_request('http://www.example.com/', $post_data);
if ($result['status'] == 'ok'){
// Print headers
echo $result['header'];
echo '<hr />';
// print the result of the whole request:
echo $result['content'];
}
else {
echo 'An error occurred: ' . $result['error'];
}
}
function post_request($url, $data, $referer='') {
// Convert the data array into URL Parameters like a=b&foo=bar etc.
$data = http_build_query($data);
// parse the given URL
$url = parse_url($url);
if ($url['scheme'] != 'http') {
die('Error: Only HTTP requests are supported!');
}
// extract host and path:
$host = $url['host'];
$path = $url['path'];
// open a socket connection on port 80 - timeout: 30 sec
$fp = fsockopen($host, 80, $errno, $errstr, 30);
if ($fp){
// send the request headers:
fputs($fp, "POST $path HTTP/1.1\r\n");
fputs($fp, "Host: $host\r\n");
if ($referer != '')
fputs($fp, "Referer: $referer\r\n");
fputs($fp, "Content-type: application/x-www-form-urlencoded\r\n");
fputs($fp, "Content-length: ". strlen($data) ."\r\n");
fputs($fp, "Connection: close\r\n\r\n");
fputs($fp, $data);
$result = '';
while(!feof($fp)) {
// receive the results of the request
$result .= fgets($fp, 128);
}
}
else {
return array(
'status' => 'err',
'error' => "$errstr ($errno)"
);
}
// close the socket connection:
fclose($fp);
// split the result header from the content
$result = explode("\r\n\r\n", $result, 2);
$header = isset($result[0]) ? $result[0] : '';
$content = isset($result[1]) ? $result[1] : '';
// return as structured array:
return array(
'status' => 'ok',
'header' => $header,
'content' => $content);
}
In addition to what Salaryman said, take a look at the classes in PEAR; there are HTTP request classes there that you can use even if you do not have the cURL extension installed in your PHP distribution.
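For reference, a hedged sketch of that PEAR route, assuming the HTTP_Request2 package is installed (URL and field names are placeholders):
require_once 'HTTP/Request2.php';

// Build a POST request and read back status and body.
$request = new HTTP_Request2('http://www.example.com/path/to/script.php', HTTP_Request2::METHOD_POST);
$request->addPostParameter('data1', 'value1');
$request->addPostParameter('data2', 'value2');
$response = $request->send();
echo $response->getStatus() . "\n";
echo $response->getBody();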
There is a good class that does what you want. It can be downloaded at: http://sourceforge.net/projects/snoopy/
private function sendHttpRequest($host, $path, $query, $port=80){
header("POST $path HTTP/1.1\r\n" );
header("Host: $host\r\n" );
header("Content-type: application/x-www-form-urlencoded\r\n" );
header("Content-length: " . strlen($query) . "\r\n" );
header("Connection: close\r\n\r\n" );
header($query);
}
This will get you going right away.
