I am facing a problem with Jetty character encoding. When the Jetty server is installed on a Mac (OS X), it works fine. But when it is installed on Ubuntu (10.10), the character encoding is wrong.
The word in the page (not URL) having problem is: The New York Times® Bestsellers
It is shown as "The New York Times� Bestsellers" on the page served by the server on Linux
and it is shown as "The New York Times® Bestsellers" on the page served by the server on Mac (This is correct)
The Jetty server version is hightide-7.0.2.v20100331.
The character encoding of the file served is UTF-8.
Can you please let me know if any settings need to be changed to overcome this problem?
Thanks in advance!
I had a similar problem with jetty 8 and solved it by adding this line to bin/jetty.sh:
JAVA_OPTIONS+=("-Dfile.encoding=UTF-8")
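For context (an aside, not part of the original answer): `file.encoding` determines the JVM's default charset, which readers and writers fall back to when no charset is given explicitly. A minimal sketch of the mojibake a wrong default produces when decoding UTF-8 bytes:

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class EncodingCheck {
    public static void main(String[] args) {
        // What the JVM uses when no charset is specified explicitly
        System.out.println("file.encoding   = " + System.getProperty("file.encoding"));
        System.out.println("default charset = " + Charset.defaultCharset());

        // Decoding UTF-8 bytes with the wrong charset garbles multi-byte
        // characters such as the registered-trademark sign:
        byte[] utf8 = "The New York Times® Bestsellers".getBytes(StandardCharsets.UTF_8);
        System.out.println(new String(utf8, StandardCharsets.ISO_8859_1)); // mojibake: ...TimesÂ®...
        System.out.println(new String(utf8, StandardCharsets.UTF_8));      // correct
    }
}
```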
I also had a problem like this and I want to thank aditsu for his answer.
I am using Restlet on top of a Jetty server on Ubuntu 12.04 (and 14.04). The Restlet application sits behind an Apache server that acts as a reverse proxy (ProxyPass).
All files are UTF-8.
All HTTP-responses have Content-Type text/html; charset=UTF-8.
All files contain <meta content="text/html; charset=UTF-8" http-equiv="content-type"/>
The strange thing was that when the server booted and I visited the site, the character encoding was not UTF-8, so I got all those funny characters, even though every signal was telling the server, the user agents, and everything in between that UTF-8 was the encoding used.
When I restarted the service manually after the server boot, all characters were fine. Because I could not easily find an answer and did not know what was causing the wrong encoding, I kept restarting the service manually.
My candidates at the time were: Apache, the Ubuntu service boot order, the Restlet framework, the file encoding actually used, the HTTP headers, and the HTML meta tags. But all were as they were supposed to be.
So in the end it was Jetty, which I only considered just now, after having revisited this issue several times.
I still do not have a clue why starting at boot time makes the character encoding go wrong while a manual restart of the service makes it correct, but adding the extra Java argument '-Dfile.encoding=UTF-8' made it all go away. Thanks to aditsu again for sharing his solution!
Cheers
Edit:
Setting the LANG environment variable in the startup script also solves the problem, i.e.
export LANG=en_US.UTF-8
This is actually the difference between starting the Jetty server at boot time (where LANG is not defined out of the box) and starting it from a shell. So there are two solutions for the same problem.
Got it; for me, it was a missing encoding declaration in the JSP:
<%@ page contentType="text/html;charset=UTF-8" language="java" %>
You are probably reading the raw, percent-encoded HTTP data directly, and you need to decode it to UTF-8 using a decoder:
use java.net.URLDecoder
line = URLDecoder.decode(line, "UTF-8");
For encoding in the other direction, e.g. when sending a Java String directly in a POST, use URLEncoder:
line = URLEncoder.encode(line, "UTF-8");
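Putting the two calls together, a minimal runnable sketch (the sample string is illustrative):

```java
import java.net.URLDecoder;
import java.net.URLEncoder;

public class UrlCodecDemo {
    public static void main(String[] args) throws Exception {
        // Percent-decode URL-encoded input into a proper Java string,
        // interpreting the escaped bytes as UTF-8:
        String raw = "The+New+York+Times%C2%AE+Bestsellers";
        String decoded = URLDecoder.decode(raw, "UTF-8");
        System.out.println(decoded);  // The New York Times® Bestsellers

        // And the reverse, e.g. before placing the string in a POST body:
        String encoded = URLEncoder.encode(decoded, "UTF-8");
        System.out.println(encoded);  // The+New+York+Times%C2%AE+Bestsellers
    }
}
```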
Related
We have been facing an issue with the Chrome browser when opening a Word document from an Apache2 WebDAV server.
(There is no error output in the console.)
I found a related article (https://productforums.google.com/forum/#!topic/chrome/INSmAWDLq7I;context-place=forum/chrome), but this seems not to be solved yet.
I've tried multiple approaches like
Open from Word
or from javascript
window.location.href = "ms-word:ofe|u|:https://example.com"
Apache2 runs on CentOS server, and I am an administrator of the server.
Both of the approaches above work fine in the Safari browser.
Solved this myself... it was just a problem on my end. I couldn't get this working because I was using a URL like
ms-word:ofe|u|'your-webdav-url'
It works if you simply escape the pipe characters:
ms-word:ofe%7Cu%7C'your-webdav-url'
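For reference, %7C is simply the percent-escape of the pipe character; a tiny sketch of producing the escaped protocol prefix:

```java
public class PipeEscape {
    public static void main(String[] args) {
        // '|' must be percent-escaped as %7C in a URL handed to the browser
        String prefix = "ms-word:ofe|u|";
        String escaped = prefix.replace("|", "%7C");
        System.out.println(escaped);  // ms-word:ofe%7Cu%7C
    }
}
```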
I am having a peculiar issue. I am running Grails (2.3.5); I think I started developing my app with 2.1.x. The issue I am having is with German character encoding (especially umlauts).
In production (Ubuntu Linux), the German special characters are not shown correctly in my browser; they are shown as "?". This does not happen on my dev machine (Windows). Additionally, domain objects scaffolded earlier (say, with Grails 2.2) are shown correctly even in production, which is what I don't get. I compared the two "show.gsp" files, checking the encoding (both are UTF-8).
Does anybody have a recommendation where to look at?
Add the Java system property -Dfile.encoding=UTF-8 to your Java web server (Tomcat?).
Today I installed v1.9.2 on a Win7 64-bit server at the fixed IP address 192.168.1.99.
When I attempt to open the Neo4j web admin using the explicit URL 192.168.1.99:7474/webadmin/, I get a timeout error. However, when I replace the URL with localhost:7474/webadmin/, I connect without a problem. This is also problematic for my internal webapp, as no one else can access it without the server URL.
Both localhost and 192.168.1.99 worked flawlessly with v1.9.RC2
Can anyone help me resolve this so that I can use the explicit url?
Can anyone recreate this to make sure I am not losing my mind?
Update: I also uncommented the line "org.neo4j.server.webserver.address=0.0.0.0" in the neo4j-server.properties file. When commented, the database only listens on localhost (only accept local connections). This had no effect.
Update 2: I worked on some JavaScript, found a bug, fixed it, and now the explicit URL is working. Not sure if this is a coincidence. Regardless, I hope uncommenting "org.neo4j.server.webserver.address=0.0.0.0" in the neo4j-server.properties file helps someone.
Thanks,
Jeff
I have a Tomcat (7) server running, through which I try to access some public files over HTTP. Some of the files on the filesystem have special characters in their names. The ones without special characters are found; the others give a 404. For example:
http://localhost:9090/processed/transcoded/Csángó_TÖMEGKERESZTELŐVEL_EGYBEKÖTÖTT_búcsú_Istensegítsfalvá20111053491309424029417_extracted.mp3
From what I found out, UTF-8 in URLs shouldn't be a problem. I've tried a URL escape function on the filename, which resulted in:
http://localhost:9090/processed/transcoded/Cs%c3%a1ng%c3%b3_T%c3%96MEGKERESZTEL%c5%90VEL_EGYBEK%c3%96T%c3%96TT_b%c3%bacs%c3%ba_Istenseg%c3%adtsfalv%c3%a120111053491309424029417_extracted%2emp3
... but that didn't seem to solve anything either. What should I try next? I have no clue what the problem is. Is it maybe related to a Tomcat setting?
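For what it's worth, the escaping above can be reproduced with java.net.URI, whose multi-argument constructor percent-encodes illegal path characters (the shortened file name below is illustrative):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class PathEncode {
    public static void main(String[] args) throws URISyntaxException {
        // The multi-argument constructor percent-encodes characters that are
        // not legal in a URL path (using UTF-8), leaving '/' intact.
        String path = "/processed/transcoded/Csángó_búcsú_extracted.mp3";
        URI uri = new URI("http", null, "localhost", 9090, path, null, null);
        System.out.println(uri.toASCIIString());
        // http://localhost:9090/processed/transcoded/Cs%C3%A1ng%C3%B3_b%C3%BAcs%C3%BA_extracted.mp3
    }
}
```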
Do you have URIEncoding="UTF-8" on your <Connector> element? If so, here's what I would do:
create a test webapp which has a filter intercepting all calls to /processed/transcoded/*
place a breakpoint on that filter and see what you get. Does the file name make sense when decoded?
try to open a new java.io.File using this path (obviously prepending local location, e.g. /home/someuser/files/... and assuming the file is there).
I don't think Tomcat does much more than what is listed above.
Another alternative would be to debug Tomcat itself.
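Steps 2 and 3 can be sketched outside the container as well (the base directory and the shortened file name are placeholders):

```java
import java.io.File;
import java.net.URLDecoder;

public class FileLookup {
    public static void main(String[] args) throws Exception {
        // What the filter would see: the percent-encoded path segment.
        String encoded = "Cs%c3%a1ng%c3%b3_extracted.mp3";

        // Does the name make sense once decoded as UTF-8?
        String name = URLDecoder.decode(encoded, "UTF-8");
        System.out.println(name);  // Csángó_extracted.mp3

        // Try to open the file at its local location (placeholder directory).
        File f = new File("/home/someuser/files", name);
        System.out.println(f + " exists: " + f.exists());
    }
}
```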
In a Rails app, emails sent from my development machine look as expected: when the raw version of the email is viewed, both the HTML and plain-text parts can be seen nested under the proper MIME types, and the HTML renders properly when viewed normally.
The live server behaves differently. Only the HTML is sent, and the MIME type is not defined, which results in the email displaying a lot of raw HTML.
I am running up-to-date versions of Arch Linux on both the server and my dev machine, with Postfix also running on each machine as the mail server.
Any ideas of what could be causing this difference?
You should probably be running vagrant and puppet/chef. Then you can find out.
I had tried restarting the Postfix daemon, but that didn't seem to help. Once traffic died down, I restarted the server and things worked. No changes were ever made to any configuration settings, and a diff between my local settings and the remote ones showed no differences.
So I don't have the slightest clue to why things were behaving the way they were.