NAudio MVC convert MP3 to WAV - asp.net-mvc

I would like to know if it is possible to convert MP3 to WAV using the NAudio library without saving the converted file to disk (using, for example, a MemoryStream)?
Link to nAudio
Any examples?
I tried like this:
byte[] fileStream = null;
MemoryStream ms2 = new MemoryStream();
using (WaveStream waveStream = WaveFormatConversionStream.CreatePcmStream(new Mp3FileReader(filePath)))
using (WaveFileWriter waveFileWriter = new WaveFileWriter(ms2, waveStream.WaveFormat))
{
    byte[] bytes3 = new byte[waveStream.Length];
    waveStream.Position = 0;
    waveStream.Read(bytes3, 0, (int)waveStream.Length);
    waveFileWriter.Write(bytes3, 0, bytes3.Length);
    fileStream = bytes3;
    return fileStream;
}
When I save a file from the returned byte array, it is damaged.

The simplest way to do it is to pass a MemoryStream instance to the WaveFileWriter constructor and then write all the samples to the writer. For the decoding side you can use the Mp3FileReader.
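As a rough sketch of that suggestion (assuming the classic NAudio 1.x Mp3FileReader/WaveFileWriter API and an input small enough to buffer in memory; Mp3ToWavBytes is just an illustrative name, not something from NAudio):
// Sketch only: decode an MP3 file to a WAV byte array entirely in memory.
// Requires: using System.IO; using NAudio.Wave;
public static byte[] Mp3ToWavBytes(string mp3Path)
{
    using (var reader = new Mp3FileReader(mp3Path))
    using (var pcmStream = WaveFormatConversionStream.CreatePcmStream(reader))
    using (var outputStream = new MemoryStream())
    {
        using (var writer = new WaveFileWriter(outputStream, pcmStream.WaveFormat))
        {
            pcmStream.CopyTo(writer); // push decoded PCM through the WAV writer
        } // disposing the writer flushes and finalises the WAV header
        // ToArray still works after the writer has closed the MemoryStream
        return outputStream.ToArray();
    }
}
The important difference from the code in the question is that the bytes are taken from the MemoryStream after the WaveFileWriter has been disposed, so the returned array contains the RIFF/WAVE header as well as the PCM data rather than the raw samples alone.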
Good luck!

Related

Losing quality after compression using TwelveMonkeys in Java and highlighters are removed

I am converting an input file (PDF, TIFF) to an output TIFF file using PDFBox (PDF to BufferedImage) and TwelveMonkeys to convert the BufferedImage to a TIFF file, resizing with an ImageWriter and IIOImage.
The file is converted, but the image loses quality. And after changing the image type from BufferedImage.TYPE_BYTE_GRAY to BufferedImage.TYPE_BYTE_BINARY, the text highlighters in the file are lost.
Below is the code used. How can I convert the image without losing quality?
I am converting an image of size 1648x2338 at 200 dpi and I want to set the photometric interpretation to min_is_white, but I have not been able to achieve this.
File inputFile = new File(inputImagePath);
BufferedImage inputImage = ImageIO.read(inputFile);
final int imageType = BufferedImage.TYPE_BYTE_BINARY;
// creates output image
BufferedImage outputImage = new BufferedImage(scaledWidth, scaledHeight, imageType);
// scales the input image to the output image
Graphics2D g2d = outputImage.createGraphics();
g2d.drawImage(inputImage, 0, 0, scaledWidth, scaledHeight, null);
g2d.dispose();
// writes to output file
final List<Entry> entries = new ArrayList<Entry>();
entries.add(new TIFFEntry(TIFF.TAG_X_RESOLUTION, new Rational(200)));
entries.add(new TIFFEntry(TIFF.TAG_Y_RESOLUTION, new Rational(200)));
entries.add(new TIFFEntry(TIFF.TAG_PHOTOMETRIC_INTERPRETATION, TIFF.TYPE_SHORT, 0));
final IIOMetadata tiffImageMetadata = new TIFFImageMetadata(entries);
ImageWriter writer = ImageIO.getImageWritersByFormatName("tiff").next();
FileImageOutputStream fio = new FileImageOutputStream(new File(outputImagePath));
ImageWriteParam params = writer.getDefaultWriteParam();
params.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
fio.setByteOrder(ByteOrder.LITTLE_ENDIAN);
IIOMetadata metadata = writer.getDefaultImageMetadata(new ImageTypeSpecifier(outputImage), params);
writer.setOutput(fio);
IIOImage iioimage = new IIOImage(outputImage, null, tiffImageMetadata);
writer.write(null, iioimage, params);
fio.close();
writer.dispose();

Xamarin - WCF Upload Large Files, Report Progress via UIProgressView

I have created a WCF service that allows uploading large files via BasicHttpBinding using streaming, and it is working great! I would like to extend this to show a progress bar (UIProgressView) so that when a large file is being uploaded in 65k chunks, the user can see that it is actively working.
The client code calling the WCF Service is:
BasicHttpBinding binding = CreateBasicHttp ();
BTSMobileWcfClient _client = new BTSMobileWcfClient (binding, endPoint);
_client.UploadFileCompleted += ClientUploadFileCompleted;
byte[] b = File.ReadAllBytes (zipFileName);
using (new OperationContextScope(_client.InnerChannel)) {
    OperationContext.Current.OutgoingMessageHeaders.Add(System.ServiceModel.Channels.MessageHeader.CreateHeader("SalvageId", "", iBTSSalvageId.ToString()));
    OperationContext.Current.OutgoingMessageHeaders.Add(System.ServiceModel.Channels.MessageHeader.CreateHeader("FileName", "", Path.GetFileName(zipFileName)));
    OperationContext.Current.OutgoingMessageHeaders.Add(System.ServiceModel.Channels.MessageHeader.CreateHeader("Length", "", b.LongLength));
    _client.UploadFileAsync(b);
}
On the server side, I read the file stream in 65k chunks and report back to the calling routine "bytes read", etc. A snippet of the code for that is:
using (FileStream targetStream = new FileStream(filePath, FileMode.CreateNew, FileAccess.Write)) {
    // read from the input stream in 64 KB (65536 byte) chunks
    const int chunkSize = 65536;
    byte[] buffer = new byte[chunkSize];
    do {
        // read bytes from input stream
        int bytesRead = request.FileData.Read(buffer, 0, chunkSize);
        if (bytesRead == 0) break;
        // write bytes to output stream
        targetStream.Write(buffer, 0, bytesRead);
    } while (true);
    targetStream.Close();
}
But I don't know how to hook into the callback on the Xamarin side to receive the "bytes read" versus "total bytes to send" so I can update the UIProgressView.
Has anyone tried this or is this even possible?
Thanks In Advance,
Bo
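One possible way to surface "bytes read" on the client (a sketch only, not from the original thread: the ProgressStream type below is hypothetical and assumes the upload operation is changed to accept a Stream rather than a byte[]) is to wrap the file in a stream that reports its own read position, since the WCF transport pulls the request body from that stream in chunks as it sends:
// Hypothetical helper: wraps an inner stream and reports how many bytes have
// been read from it so far. Because WCF streams the request body by reading
// from this stream in chunks, each Read call approximates upload progress.
// Requires: using System; using System.IO;
public class ProgressStream : Stream
{
    readonly Stream inner;
    readonly long totalBytes;
    readonly Action<long, long> onProgress; // (bytesReadSoFar, totalBytes)
    long bytesRead;

    public ProgressStream(Stream inner, long totalBytes, Action<long, long> onProgress)
    {
        this.inner = inner;
        this.totalBytes = totalBytes;
        this.onProgress = onProgress;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int n = inner.Read(buffer, offset, count);
        bytesRead += n;
        onProgress(bytesRead, totalBytes);
        return n;
    }

    // Remaining members delegate to the inner stream; writing is not supported.
    public override bool CanRead { get { return inner.CanRead; } }
    public override bool CanSeek { get { return inner.CanSeek; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { return inner.Length; } }
    public override long Position { get { return inner.Position; } set { inner.Position = value; } }
    public override void Flush() { inner.Flush(); }
    public override long Seek(long offset, SeekOrigin origin) { return inner.Seek(offset, origin); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}
The progress callback would still need to marshal back to the UI thread (for example with InvokeOnMainThread) before setting UIProgressView.Progress to bytesRead / (float)totalBytes.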

iOS with MonoTouch - upload of WAV file fails

I'm writing an iOS app with MonoTouch.
I'm recording a sound file via the AVRecorder and create a working WAV file (which I have checked).
I want to send it to my server by passing it as a byte array (this seems to be the best way).
The problem is that the file passed to the server is corrupted.
This is how I convert the WAV to a byte[]. Any ideas what I'm doing wrong?
(I iterate once to find the file's size and then copy everything. The reason I do this is that GetProperty returns the wrong file size.)
string path = GlobalData.UserDefaults.StringForKey(GlobalData.HeykuSoundPathDefault);
AudioFile song = AudioFile.Open(path,AudioFilePermission.Read,AudioFileType.WAVE);
byte [] audioBuffer = new byte[4000000];
int copied = 0;
int offset = 0;
while (true)
{
    copied = song.Read(offset, audioBuffer, 0, 4096, false);
    if (copied == -1)
    {
        break;
    }
    else
    {
        offset += copied;
        Console.WriteLine(offset);
    }
}
audioBuffer = new byte[offset];
copied = song.Read(0,audioBuffer,0,offset,false);
Have you tried simply using streams? Something like this:
using (var fileReader = new StreamReader("filename_here"))
{
    using (var webStream = webClient.OpenWrite("http://www.myuploader.com"))
    {
        // copy synchronously so the streams are not disposed before the copy finishes
        fileReader.BaseStream.CopyTo(webStream);
    }
}
The above method should copy the whole file, including the WAVE headers. I suspect the AudioFile could be returning pure PCM audio, which will not include headers. This is an example of a header writer function:
private void WriteHeader()
{
    this.writer.Seek(0, SeekOrigin.Begin);
    // RIFF chunk ID and overall size
    this.writer.Write('R');
    this.writer.Write('I');
    this.writer.Write('F');
    this.writer.Write('F');
    this.writer.Write(this.ByteCount);
    // WAVE format marker
    this.writer.Write('W');
    this.writer.Write('A');
    this.writer.Write('V');
    this.writer.Write('E');
    // "fmt " sub-chunk
    this.writer.Write('f');
    this.writer.Write('m');
    this.writer.Write('t');
    this.writer.Write(' ');
    this.writer.Write((int)16);                  // fmt chunk size
    this.writer.Write((short)1);                 // audio format: PCM
    this.writer.Write((short)this.ChannelCount);
    this.writer.Write((int)this.SampleRate);
    this.writer.Write((int)this.SampleRate * 2); // byte rate (assumes 16-bit mono)
    this.writer.Write((short)2);                 // block align (assumes 16-bit mono)
    this.writer.Write((short)this.BitsPerSample);
    // "data" sub-chunk
    this.writer.Write('d');
    this.writer.Write('a');
    this.writer.Write('t');
    this.writer.Write('a');
    this.writer.Write((int)this.byteCount);
}
More info on the file format: https://ccrma.stanford.edu/courses/422/projects/WaveFormat/
As for reading the data from the AudioFile, this would probably work better:
var song = AudioFile.Open(path, AudioFilePermission.Read, AudioFileType.WAVE);
var audioBuffer = new byte[song.Length];
int copied = 0;
while (copied < audioBuffer.Length)
{
    copied += song.Read(copied, audioBuffer, copied, 4096, false);
}

How to send an image stored in RMS to a server in J2ME?

I want to send an image stored in the RMS to the server. I have stored the captured image in the RMS; I can access it successfully and display it on the device, but when I send it to the server, only the name of the image appears there and the image itself is not generated.
Here is the code that I am trying to use:
byte[] byteArrRec = LoadImagesFromRMS.objImageRecordStore.getRecord(recID);
ByteArrayInputStream bin = new ByteArrayInputStream(byteArrRec);
DataInputStream din = new DataInputStream(bin);
int width = din.readInt();
int height = din.readInt();
int length = din.readInt();
int[] rawImg = new int[width * height];
for (int itemp = 0; itemp < length; itemp++) {
    rawImg[itemp] = din.readInt();
}
Image tempImage = Image.createRGBImage(rawImg, width, height, false);
byteArr = get_Byte_Array(tempImage);
Then I pass the byte array to the server using a POST request.
But the image is not generated. Does anyone have any idea about this?
First you need to read all the bytes of the uploaded data on the server side and store them in a byte array variable (bytearray). Then create a ByteArrayInputStream from that byte array and use the ImageIO class to read the image from the stream:
InputStream in = new ByteArrayInputStream(bytearray);
BufferedImage image = ImageIO.read(in);
Thanks
On the client you need to create an HttpConnection to the remote server; after creating the connection, create a DataOutputStream associated with that HttpConnection.
Now write the byte array to that DataOutputStream and send it with the "POST" method. If the byte array is very large, try to send it in chunks.

Can we compress a large file in chunks with GZIP on BlackBerry?

I saw the sample API below:
public static byte[] compress(byte[] data)
{
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    try
    {
        GZIPOutputStream gzipStream = new GZIPOutputStream(baos, 6, GZIPOutputStream.MAX_LOG2_WINDOW_LENGTH);
        gzipStream.write(data);
        gzipStream.close();
    }
    catch (IOException ioe)
    {
        return null;
    }
    return baos.toByteArray();
}
But when I tried to compress a large file on a Curve 8900 with OS 4.6, I got an "OutOfMemoryError", so I would like to know how to compress in small chunks.
I already tried the code below, but it doesn't work; the compressed file cannot be decompressed:
file = (FileConnection) Connector.open(_fileOutputPath, Connector.READ_WRITE);
if (!file.exists()) {
    file.create();
}
os = file.openOutputStream();
is = FileUtil.getInputStream(_fileInputPath, 0);
int tmpSize = 1024;
byte[] tmp = new byte[tmpSize];
int len = -1;
gzipStream = new GZIPOutputStream(os, 6, GZIPOutputStream.MAX_LOG2_WINDOW_LENGTH);
while ((len = is.read(tmp, 0, tmpSize)) != -1) {
    gzipStream.write(tmp, 0, len);
}
GZIPOutputStream does not produce a file suitable for use with the gzip command line tool. This is because it doesn't produce the necessary file headers. How did you test decompressing it? You should write a similar Java program that makes use of GZIPInputStream to test, as 'gunzip' is not going to recognize the input.
The problem with the first code sample is that the ByteArrayOutputStream gets too big for the limited memory of a mobile device.
An option could be to write to a file first, for instance on an SD card.
The second code sample seems fine, but see Michael's answer.
