How to get browser content as a string in BlackBerry?

I want to display the content of any given URL as a string.
Can anyone give some sample code here?

String url = "http://google.com";
HttpConnection _conn = (HttpConnection) Connector.open(url);
int rc = _conn.getResponseCode();
System.out.println("RC : " + rc);
if (rc != HttpConnection.HTTP_OK)
    return;
byte[] result = null;
InputStream is = _conn.openInputStream();
// Get the content type
String type = _conn.getType();
// Get the length and process the data
int len = (int) _conn.getLength();
if (len > 0) { // Content-Length is defined
    int actual = 0;
    int bytesread = 0;
    result = new byte[len];
    while ((bytesread != len) && (actual != -1)) {
        actual = is.read(result, bytesread, len - bytesread);
        if (actual != -1) {
            bytesread += actual;
        }
    }
} else { // Content-Length is not defined in the HTTP response
    // Data accumulation buffer (for the whole response)
    NoCopyByteArrayOutputStream outputStream = new NoCopyByteArrayOutputStream(1024);
    // Receive buffer (for each portion of data)
    byte[] buff = new byte[1024];
    while ((len = is.read(buff)) > 0) {
        // Write the received portion of data into the accumulation stream
        outputStream.write(buff, 0, len);
    }
    result = outputStream.toByteArray();
}
System.out.println(new String(result));
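For reference, a minimal sketch of the same approach wrapped in a helper that returns the page as a String and closes the connection afterwards. The class and method names (PageFetcher, fetchAsString) are made up for illustration, it uses the standard java.io.ByteArrayOutputStream instead of RIM's NoCopyByteArrayOutputStream, and any connection suffix (e.g. ";interface=wifi") is omitted. Call it from a worker thread, not the event thread.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;

public final class PageFetcher {
    // Fetches the body of the given URL and returns it as a String,
    // or null if the server does not answer with HTTP 200.
    public static String fetchAsString(String url) throws IOException {
        HttpConnection conn = null;
        InputStream in = null;
        try {
            conn = (HttpConnection) Connector.open(url);
            if (conn.getResponseCode() != HttpConnection.HTTP_OK) {
                return null;
            }
            in = conn.openInputStream();
            ByteArrayOutputStream out = new ByteArrayOutputStream(1024);
            byte[] buff = new byte[1024];
            int n;
            while ((n = in.read(buff)) > 0) {
                out.write(buff, 0, n);
            }
            return new String(out.toByteArray());
        } finally {
            if (in != null) {
                try { in.close(); } catch (IOException ignored) {}
            }
            if (conn != null) {
                try { conn.close(); } catch (IOException ignored) {}
            }
        }
    }
}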

Related

AudioConverter#FillComplexBuffer returns -50 and does not convert anything

I'm closely following this Xamarin sample (based on this Apple sample) to convert a LinearPCM file to an AAC file.
The sample works great, but implemented in my project, the FillComplexBuffer method returns error -50 and the InputData event is never triggered, so nothing is converted.
The error only appears when testing on a device. When testing on the emulator, everything goes fine and I get a good encoded AAC file at the end.
I tried a lot of things today, and I don't see any difference between my code and the sample code. Do you have any idea where this may come from?
I don't know if this is in any way related to Xamarin; it doesn't seem so, since the Xamarin sample works great.
Here's the relevant part of my code:
protected void Encode(string path)
{
    // In class setup. File at TempWavFilePath has DecodedFormat as format.
    //
    // DecodedFormat = AudioStreamBasicDescription.CreateLinearPCM();
    // AudioStreamBasicDescription encodedFormat = new AudioStreamBasicDescription()
    // {
    //     Format = AudioFormatType.MPEG4AAC,
    //     SampleRate = DecodedFormat.SampleRate,
    //     ChannelsPerFrame = DecodedFormat.ChannelsPerFrame,
    // };
    // AudioStreamBasicDescription.GetFormatInfo (ref encodedFormat);
    // EncodedFormat = encodedFormat;

    // Setup converter
    AudioStreamBasicDescription inputFormat = DecodedFormat;
    AudioStreamBasicDescription outputFormat = EncodedFormat;
    AudioConverterError converterCreateError;
    AudioConverter converter = AudioConverter.Create(inputFormat, outputFormat, out converterCreateError);
    if (converterCreateError != AudioConverterError.None)
    {
        Console.WriteLine("Converter creation error: " + converterCreateError);
    }
    converter.EncodeBitRate = 192000; // AAC 192kbps

    // get the actual formats back from the Audio Converter
    inputFormat = converter.CurrentInputStreamDescription;
    outputFormat = converter.CurrentOutputStreamDescription;

    /*** INPUT ***/
    AudioFile inputFile = AudioFile.OpenRead(NSUrl.FromFilename(TempWavFilePath));
    // init buffer
    const int inputBufferBytesSize = 32768;
    IntPtr inputBufferPtr = Marshal.AllocHGlobal(inputBufferBytesSize);
    // calc number of packets per read
    int inputSizePerPacket = inputFormat.BytesPerPacket;
    int inputBufferPacketSize = inputBufferBytesSize / inputSizePerPacket;
    AudioStreamPacketDescription[] inputPacketDescriptions = null;
    // init position
    long inputFilePosition = 0;
    // define input delegate
    converter.InputData += delegate(ref int numberDataPackets, AudioBuffers data, ref AudioStreamPacketDescription[] dataPacketDescription)
    {
        // how much to read
        if (numberDataPackets > inputBufferPacketSize)
        {
            numberDataPackets = inputBufferPacketSize;
        }
        // read from the file
        int outNumBytes;
        AudioFileError readError = inputFile.ReadPackets(false, out outNumBytes, inputPacketDescriptions, inputFilePosition, ref numberDataPackets, inputBufferPtr);
        if (readError != 0)
        {
            Console.WriteLine("Read error: " + readError);
        }
        // advance input file packet position
        inputFilePosition += numberDataPackets;
        // put the data pointer into the buffer list
        data.SetData(0, inputBufferPtr, outNumBytes);
        // add packet descriptions if required
        if (dataPacketDescription != null)
        {
            if (inputPacketDescriptions != null)
            {
                dataPacketDescription = inputPacketDescriptions;
            }
            else
            {
                dataPacketDescription = null;
            }
        }
        return AudioConverterError.None;
    };

    /*** OUTPUT ***/
    // create the destination file
    var outputFile = AudioFile.Create(NSUrl.FromFilename(path), AudioFileType.M4A, outputFormat, AudioFileFlags.EraseFlags);
    // init buffer
    const int outputBufferBytesSize = 32768;
    IntPtr outputBufferPtr = Marshal.AllocHGlobal(outputBufferBytesSize);
    AudioBuffers buffers = new AudioBuffers(1);
    // calc number of packets per write
    int outputSizePerPacket = outputFormat.BytesPerPacket;
    AudioStreamPacketDescription[] outputPacketDescriptions = null;
    if (outputSizePerPacket == 0)
    {
        // if the destination format is VBR, we need to get max size per packet from the converter
        outputSizePerPacket = (int)converter.MaximumOutputPacketSize;
        // allocate memory for the PacketDescription structures describing the layout of each packet
        outputPacketDescriptions = new AudioStreamPacketDescription[outputBufferBytesSize / outputSizePerPacket];
    }
    int outputBufferPacketSize = outputBufferBytesSize / outputSizePerPacket;
    // init position
    long outputFilePosition = 0;
    long totalOutputFrames = 0; // used for debugging
    // write magic cookie if necessary
    if (converter.CompressionMagicCookie != null && converter.CompressionMagicCookie.Length != 0)
    {
        outputFile.MagicCookie = converter.CompressionMagicCookie;
    }

    // loop to convert data
    Console.WriteLine("Converting...");
    while (true)
    {
        // create buffer
        buffers[0] = new AudioBuffer()
        {
            NumberChannels = outputFormat.ChannelsPerFrame,
            DataByteSize = outputBufferBytesSize,
            Data = outputBufferPtr
        };
        int writtenPackets = outputBufferPacketSize;
        // LET'S CONVERT (it's about time...)
        AudioConverterError converterFillError = converter.FillComplexBuffer(ref writtenPackets, buffers, outputPacketDescriptions);
        if (converterFillError != AudioConverterError.None)
        {
            Console.WriteLine("FillComplexBuffer error: " + converterFillError);
        }
        if (writtenPackets == 0) // EOF
        {
            break;
        }
        // write to output file
        int inNumBytes = buffers[0].DataByteSize;
        AudioFileError writeError = outputFile.WritePackets(false, inNumBytes, outputPacketDescriptions, outputFilePosition, ref writtenPackets, outputBufferPtr);
        if (writeError != 0)
        {
            Console.WriteLine("WritePackets error: {0}", writeError);
        }
        // advance output file packet position
        outputFilePosition += writtenPackets;
        if (FlowFormat.FramesPerPacket != 0)
        {
            // the format has constant frames per packet
            totalOutputFrames += (writtenPackets * FlowFormat.FramesPerPacket);
        }
        else
        {
            // variable frames per packet require doing this for each packet (adding up the number of sample frames of data in each packet)
            for (var i = 0; i < writtenPackets; ++i)
            {
                totalOutputFrames += outputPacketDescriptions[i].VariableFramesInPacket;
            }
        }
    }

    // write out any of the leading and trailing frames for compressed formats only
    if (outputFormat.BitsPerChannel == 0)
    {
        Console.WriteLine("Total number of output frames counted: {0}", totalOutputFrames);
        WritePacketTableInfo(converter, outputFile);
    }
    // write the cookie again - sometimes codecs will update cookies at the end of a conversion
    if (converter.CompressionMagicCookie != null && converter.CompressionMagicCookie.Length != 0)
    {
        outputFile.MagicCookie = converter.CompressionMagicCookie;
    }

    // Clean everything
    Marshal.FreeHGlobal(inputBufferPtr);
    Marshal.FreeHGlobal(outputBufferPtr);
    converter.Dispose();
    outputFile.Dispose();

    // Remove temp file
    File.Delete(TempWavFilePath);
}
I already saw this SO question, but its rather brief C++/Obj-C answer doesn't seem to fit my problem.
Thanks!
I finally found the solution!
I just had to set the AVAudioSession category before converting the file.
AVAudioSession.SharedInstance().SetCategory(AVAudioSessionCategory.AudioProcessing);
AVAudioSession.SharedInstance().SetActive(true);
Since I also use an AudioQueue for offline rendering, I actually have to set the category to AVAudioSessionCategory.PlayAndRecord so that both the offline rendering and the audio conversion work.

is "HttpWebRequest" using of my site's Bandwidth?

I want to read an mp3 file from an external URL and hide the URL path from clients on my website.
I'm using this code:
//Create a stream for the file
Stream stream = null;
//This controls how many bytes to read at a time and send to the client
int bytesToRead = 10000;
// Buffer to read bytes in the chunk size specified above
byte[] buffer = new Byte[bytesToRead];
// The number of bytes read
try
{
    //Create a WebRequest to get the file
    HttpWebRequest fileReq = (HttpWebRequest)HttpWebRequest.Create("http://www.alsacreations.fr/mp3/everywhere.mp3");
    //Create a response for this request
    HttpWebResponse fileResp = (HttpWebResponse)fileReq.GetResponse();
    if (fileReq.ContentLength > 0)
        fileResp.ContentLength = fileReq.ContentLength;
    //Get the Stream returned from the response
    stream = fileResp.GetResponseStream();
    // prepare the response to the client. resp is the client Response
    var resp = HttpContext.Current.Response;
    //Indicate the type of data being sent
    resp.ContentType = "application/octet-stream";
    //Name the file
    string fileName = "everywhere.mp3";
    resp.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
    resp.AddHeader("Content-Length", fileResp.ContentLength.ToString());
    int length;
    do
    {
        // Verify that the client is connected.
        if (resp.IsClientConnected)
        {
            // Read data into the buffer.
            length = stream.Read(buffer, 0, bytesToRead);
            // and write it out to the response's output stream
            resp.OutputStream.Write(buffer, 0, length);
            // Flush the data
            resp.Flush();
            //Clear the buffer
            buffer = new Byte[bytesToRead];
        }
        else
        {
            // cancel the download if the client has disconnected
            length = -1;
        }
    } while (length > 0); //Repeat until no data is read
}
finally
{
    if (stream != null)
    {
        //Close the input stream
        stream.Close();
    }
}
This code works correctly. My question is: does downloading this mp3 file (http://www.alsacreations.fr/mp3/everywhere.mp3) use my site's bandwidth or not?

How to reduce memory when loading an image from a website?

I am using this utility class:
public class Util_ImageLoader {
    public static Bitmap _bmap;

    Util_ImageLoader(String url) {
        HttpConnection connection = null;
        InputStream inputStream = null;
        EncodedImage bitmap;
        byte[] dataArray = null;
        try {
            connection = (HttpConnection) Connector.open(url + Util_GetInternet.getConnParam(), Connector.READ,
                    true);
            inputStream = connection.openInputStream();
            byte[] responseData = new byte[10000];
            int length = 0;
            StringBuffer rawResponse = new StringBuffer();
            while (-1 != (length = inputStream.read(responseData))) {
                rawResponse.append(new String(responseData, 0, length));
            }
            int responseCode = connection.getResponseCode();
            if (responseCode != HttpConnection.HTTP_OK) {
                throw new IOException("HTTP response code: " + responseCode);
            }
            final String result = rawResponse.toString();
            dataArray = result.getBytes();
        } catch (final Exception ex) {
        } finally {
            try {
                inputStream.close();
                inputStream = null;
                connection.close();
                connection = null;
            } catch (Exception e) {
            }
        }
        bitmap = EncodedImage
                .createEncodedImage(dataArray, 0, dataArray.length);
        int multH;
        int multW;
        int currHeight = bitmap.getHeight();
        int currWidth = bitmap.getWidth();
        multH = Fixed32.div(Fixed32.toFP(currHeight), Fixed32.toFP(currHeight)); // height
        multW = Fixed32.div(Fixed32.toFP(currWidth), Fixed32.toFP(currWidth)); // width
        bitmap = bitmap.scaleImage32(multW, multH);
        _bmap = bitmap.getBitmap();
    }

    public Bitmap getbitmap() {
        return _bmap;
    }
}
When I call it from a ListField that contains 10 children, the log keeps saying "failed to allocate timer 0: no slots left".
This means the memory is being used up, there is no more memory to allocate, and as a result my main screen cannot start.
At the same time you have the following objects in memory:
// A buffer of about 10KB
byte[] responseData = new byte[10000];
// A string buffer which will grow up to the total response size
rawResponse.append(new String(responseData, 0, length));
// Another string the same length that string buffer
final String result = rawResponse.toString();
// Now another buffer the same size of the response.
dataArray = result.getBytes();
In total, if you downloaded n ASCII chars, you simultaneously have 10 KB, plus 2*n bytes in the first Unicode string buffer, plus 2*n bytes in the result string, plus n bytes in dataArray. If I'm not wrong, that sums up to 5n + 10 KB. There's room for optimization.
Some improvements would be (a rough sketch follows this list):
Check the response code first, and only read the stream when the server returns HTTP 200. There is no need to read the body if the server returned an error.
Get rid of the strings. There is no need to convert to a String if you are only going to convert back to bytes afterwards.
If images are large, don't hold them in RAM while downloading. Instead, open a FileOutputStream and write to a temporary file as you read from the input stream. Then, if the downloaded images are still too large to display, downscale them.
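A minimal sketch of the first two points, keeping the BlackBerry HttpConnection API used above. The helper name loadImageBytes is made up, it uses java.io.ByteArrayOutputStream for the in-memory case, and the connection-suffix handling (Util_GetInternet.getConnParam()) from the original code is left out:

// Check the response code first, then read raw bytes straight into a
// byte array -- no String/StringBuffer round trip.
private static byte[] loadImageBytes(String url) throws IOException {
    HttpConnection connection = null;
    InputStream in = null;
    try {
        connection = (HttpConnection) Connector.open(url, Connector.READ, true);
        // 1. Check the response code before reading anything
        if (connection.getResponseCode() != HttpConnection.HTTP_OK) {
            throw new IOException("HTTP response code: " + connection.getResponseCode());
        }
        in = connection.openInputStream();
        // 2. Accumulate raw bytes; no String conversion
        ByteArrayOutputStream out = new ByteArrayOutputStream(4096);
        byte[] buffer = new byte[4096];
        int length;
        while ((length = in.read(buffer)) != -1) {
            out.write(buffer, 0, length);
        }
        return out.toByteArray();
    } finally {
        if (in != null) {
            try { in.close(); } catch (IOException ignored) {}
        }
        if (connection != null) {
            try { connection.close(); } catch (IOException ignored) {}
        }
    }
}

The returned array can then be passed straight to EncodedImage.createEncodedImage(), with no intermediate String objects kept in memory.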

Connection closed when trying to POST over 1.5 KB of URL-encoded data on an 8900, and maybe others

From the simulator, this all works.
I'm using Wi-Fi on the device, as I'm assuming it's the most stable.
The problem occurs when I try to post more than 1.5 KB of URL-encoded data.
If I send less, then it's fine.
It seems to hang on the .flush() call.
It works on a physical 9700, so I'm presuming it's possibly device-specific.
In the example below I'm using form variables, but I've also tried posting JSON content and still had the same issue.
I've written a small test app that uses the main thread, so I know it's not threads getting confused.
If anyone has any ideas, that would be great.
private String PostEventsTest()
{
    String returnValue = "Error";
    HttpConnection hc = null;
    DataInputStream dis = null;
    DataOutputStream dos = null;
    StringBuffer messagebuffer = new StringBuffer();
    URLEncodedPostData postValuePairs;
    try
    {
        postValuePairs = new URLEncodedPostData(null, false);
        postValuePairs.append("DATA", postData); // postData);
        hc = (HttpConnection) Connector.open(postURL, Connector.READ_WRITE);
        hc.setRequestMethod(HttpConnection.POST);
        hc.setRequestProperty("User-Agent", "BlackBerry");
        hc.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        hc.setRequestProperty("Content-Length", Integer.toString(postValuePairs.getBytes().length));
        //hc.setRequestProperty("Content-Length", Integer.toString(postData.length()));
        dos = hc.openDataOutputStream();
        dos.write(postValuePairs.getBytes());
        dos.flush();
        dos.close();
        // Retrieve the response back from the servlet
        dis = new DataInputStream(hc.openInputStream());
        int ch;
        // Check the Content-Length first
        long len = hc.getLength();
        if (len != -1)
        {
            for (int i = 0; i < len; i++)
                if ((ch = dis.read()) != -1)
                    messagebuffer.append((char) ch);
        }
        else
        {
            // if the content-length is not available
            while ((ch = dis.read()) != -1)
                messagebuffer.append((char) ch);
        }
        dis.close();
        returnValue = "Yahoo";
    }
    catch (Exception ex)
    {
        returnValue = ex.toString();
        ex.printStackTrace();
    }
    return returnValue;
}
Instead of data streams you should just use the regular input and output streams. So instead of hc.openDataOutputStream(), use hc.openOutputStream(). Data streams are for serializing Java primitive types to a stream, but you just want to write the raw bytes to the stream -- so a regular OutputStream is what you want. The same goes for reading the response: just use the InputStream returned by hc.openInputStream().
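A minimal sketch of that change, reusing hc and postValuePairs from the code above; only the stream handling differs:

// Write the raw POST body with a plain OutputStream instead of a DataOutputStream
OutputStream os = hc.openOutputStream();
os.write(postValuePairs.getBytes());
os.flush();
os.close();

// Read the response with a plain InputStream as well
InputStream is = hc.openInputStream();
StringBuffer response = new StringBuffer();
int ch;
while ((ch = is.read()) != -1) {
    response.append((char) ch);
}
is.close();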

HTTPS connection

I am using the following code to establish an HTTPS connection:
HttpsConnection httpConnector = null;
InputStream in = null;
Document doc;
String content = "";
try
{
    httpConnector = (HttpsConnection) Connector.open(url, Connector.READ_WRITE);
    httpConnector.setRequestMethod(HttpConnection.GET);
    in = httpConnector.openInputStream();
    byte[] data = new byte[in.available()];
    int len = 0;
    int size = 0;
    StringBuffer raw = new StringBuffer();
    while (-1 != (len = in.read(data))) {
        raw.append(new String(data, 0, len));
        size += len;
    }
    content = raw.toString().trim();
}
catch (Exception ex)
{
    ex.printStackTrace();
    return false;
}
try {
    in.close();
    in = null;
    httpConnector.close();
    httpConnector = null;
} catch (Exception ex)
{
    Dialog.alert("Error:" + ex.getMessage());
    return false;
}
I think I am able to establish the connection, but the values are not coming back. I am testing it on the simulator; I have not tested on a device.
I think your mistake is in the following line:
byte[] data = new byte[in.available()];
The available() method only returns how many bytes are immediately available for reading from the InputStream, but you are using it to set the size of the temporary byte array. Since it's possible that available() returns 0, you may be initializing a zero-length array.
It would be better to just initialize "data" with a fixed-length array.
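A minimal sketch of that change, reusing the in, raw, and content variables from the code above and assuming a 1024-byte read buffer:

// Use a fixed-size read buffer instead of sizing it with available()
byte[] data = new byte[1024];
int len;
StringBuffer raw = new StringBuffer();
while (-1 != (len = in.read(data))) {
    raw.append(new String(data, 0, len));
}
content = raw.toString().trim();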
