I'm trying to use the AFNetworking library to upload a file to Clyp.it. I've looked at the documentation here: https://github.com/AFNetworking/AFNetworking and have configured my code to upload an audio file like so:
NSIndexPath *cellIndexPath = [self.tableView indexPathForCell:cell];
ICIRecordingCell *c = (ICIRecordingCell *)[self.tableView cellForRowAtIndexPath:cellIndexPath];
NSString *fileName = c.title.text;
NSURL *filePath = [NSURL fileURLWithPath:fileName];

NSMutableURLRequest *request = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST"
    URLString:@"http://upload.clyp.it/upload"
    parameters:nil
    constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        [formData appendPartWithFileURL:filePath name:@"audioFile" fileName:@"audio.m4a" mimeType:@"audio/m4a" error:nil];
    } error:nil];
AFURLSessionManager *manager = [[AFURLSessionManager alloc] initWithSessionConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]];
NSProgress *progress = nil;
NSURLSessionUploadTask *uploadTask = [manager uploadTaskWithStreamedRequest:request progress:&progress completionHandler:^(NSURLResponse *response, id responseObject, NSError *error) {
    if (error) {
        NSLog(@"Error: %@", error);
    } else {
        NSLog(@"%@ %@", response, responseObject);
    }
}];
[uploadTask resume];
But the completion handler never fires, so I never see either responseObject or an error. I'm not even certain this is the type of request I should be sending. The Clyp.it API docs describe the upload as follows:
Upload an audio file. Accepted file types are mp3, ogg, m4a, wav,
aiff, aif and 3gpp. Regardless of file type when uploading, a
resulting mp3 and ogg will be created and made available. The audio
file can be added to a playlist by providing a playlistId and the
playlistUploadToken. Otherwise, a new playlist will automatically be
created. A playlist can contain a maximum of 20 audio files. The title
of the audio file will be set to that of the name of the uploaded
file.
Parameters:
- audioFile - The binaries of the audio file.
- playlistId - Optional. The playlist that the audio file will be added to. If this value is specified, the correct playlistUploadToken must also be included in the request. If this value is not specified, a playlist will be automatically created.
- playlistUploadToken - Optional. Given to you after you create a playlist. When adding an audio file to an already existing playlist, this value must be provided.
- order - Optional. The position in which you would like this audio file to appear in the playlist.
- description - Optional. The description of the audio file. Maximum allowed length is 420 characters.
- longitude - Optional. The longitude of where the audio file was recorded. If passed in, latitude becomes required. Value must be between -15069 and 15069 degrees.
- latitude - Optional. The latitude of where the audio file was recorded. If passed in, longitude becomes required. Value must be between -90 and 90 degrees.
Uploads are done via a multipart/form-data POST. Consider the
following form:
It will create a request that looks like this:

POST http://upload.clyp.it/upload HTTP/1.1
Host: upload.clyp.it
Connection: keep-alive
Content-Type: multipart/form-data; boundary=---------------------------21632794128452
Content-Length: 5005

-----------------------------21632794128452
Content-Disposition: form-data; name="audioFile"; filename="MyAudioFile.mp3"
Content-Type: audio/mpeg

(Audio file data goes here)
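For comparison, here is a minimal, untested sketch of the same multipart upload expressed with AFHTTPSessionManager's convenience method (this assumes AFNetworking 2.x; filePath is the file URL built above, and the audioFile field name comes from the docs):

AFHTTPSessionManager *manager = [AFHTTPSessionManager manager];
[manager POST:@"http://upload.clyp.it/upload"
   parameters:nil
constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    NSError *appendError = nil;
    [formData appendPartWithFileURL:filePath
                               name:@"audioFile"
                           fileName:@"audio.m4a"
                           mimeType:@"audio/m4a"
                              error:&appendError];
    if (appendError) {
        // If the file URL is unreachable, the part is never appended and the upload body is empty.
        NSLog(@"Could not append file: %@", appendError);
    }
} success:^(NSURLSessionDataTask *task, id responseObject) {
    NSLog(@"Response: %@", responseObject);
} failure:^(NSURLSessionDataTask *task, NSError *error) {
    NSLog(@"Error: %@", error);
}];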
Am I using the right approach? Thanks in advance.
Related
I have downloaded a JSON file (data.json) using NSURLSession. I am trying to access this file from a local HTML file, myfile.html (in the main project folder), which is displayed via UIWebView. From NSLog I have identified that the temp file location is here:
file:///Users/administrator/Library/Developer/CoreSimulator/Devices/166B438D-4F67-448C-B0E5-B32438DA3BF9/data/Containers/Data/Application/FCA61482-BC5D-4804-91F8-891EAB4DB145/tmp/CFNetworkDownload_g6961Q.tmp
My question is: how should I access this temp file from the local 'top level' HTML file, using a relative path?
More generally, how does one access the iOS application file structure from such a top-level (relative to the project) HTML file?
Some background info:
When I manually copy data.json into Xcode and access it from myfile.html using the relative path 'data.json', it works.
The download code is:
NSURL *URL = [NSURL URLWithString:@"http://myurl.com/data.json"];
NSURLRequest *request = [NSURLRequest requestWithURL:URL];
NSURLSession *session = [NSURLSession sharedSession];
NSURLSessionDownloadTask *downloadTask = [session downloadTaskWithRequest:request completionHandler:
    ^(NSURL *location, NSURLResponse *response, NSError *error) {
        // `location` is a temporary file that is removed after this block returns,
        // so move or copy it somewhere permanent here if you need to keep it.
}];
[downloadTask resume];
The reason I am doing this with NSURLSession is that I can't do it with Javascript in the html file since it is a cross domain request.
EDIT: Thanks for the helpful answers about adding HTML as a string into the UIWebView. I can see that trying to reference a temp file from the local HTML file is a bit unwieldy and may run counter to Apple's preference of not exposing file paths, but I am still interested in a definitive answer on whether it can be done.
Perhaps you could inject the JSON data into the web page once it has finished loading within the web view. Something like this (I have not tested it):
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    NSError *error = nil;
    NSURL *URL = [NSURL URLWithString:@"http://myurl.com/data.json"];
    NSString *result = [NSString stringWithContentsOfURL:URL
                                                encoding:NSUTF8StringEncoding
                                                   error:&error];
    if (!error && result.length > 0)
    {
        // Note: any quotes or newlines in the JSON would need to be escaped
        // before embedding it in a JavaScript string literal like this.
        [webView stringByEvaluatingJavaScriptFromString:[NSString stringWithFormat:@"var jsonData = \"%@\"", result]];
    }
    else
    {
        NSLog(@"Oops: %@", error);
    }
}
I don't think you can either. Instead of loading your resource HTML as-is, use it as a "template" and put a "placeholder" string where you need the link to your temp file.
Then, load your HTML into a (mutable) string, replace the placeholder for the actual path to the file and pass this modified HTML string to your web view.
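A minimal sketch of that template approach; the placeholder string __JSON_PATH__ and the jsonFileURL variable are illustrative names, not from the original post:

// Load the bundled HTML "template", swap in the real path, and hand the result to the web view.
NSURL *templateURL = [[NSBundle mainBundle] URLForResource:@"myfile" withExtension:@"html"];
NSMutableString *html = [[NSString stringWithContentsOfURL:templateURL
                                                  encoding:NSUTF8StringEncoding
                                                     error:nil] mutableCopy];
// jsonFileURL is wherever you copied the downloaded temp file (e.g. into the Documents directory).
[html replaceOccurrencesOfString:@"__JSON_PATH__"
                      withString:jsonFileURL.absoluteString
                         options:0
                           range:NSMakeRange(0, html.length)];
[self.webView loadHTMLString:html baseURL:[templateURL URLByDeletingLastPathComponent]];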
Currently I'm using the following code to upload videos:
NSURLRequest *urlRequest = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST"
    URLString:[[entity uploadUrl] absoluteString]
    parameters:entity.params
    constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        [UploadModel getAssetData:entity.asset resultHandler:^(NSData *filedata) {
            NSString *mimeType = [FileHelper mimeTypeForFileAtUrl:entity.fileUrl];
            // NSError *fileappenderror;
            [formData appendPartWithFileData:filedata name:@"data" fileName:entity.filename mimeType:mimeType];
        }];
    } error:&urlRequestError];
The getAssetData method:
+(void)getAssetData: (PHAsset*)mPhasset resultHandler:(void(^)(NSData *imageData))dataResponse{
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
[[PHImageManager defaultManager] requestAVAssetForVideo:mPhasset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
if ([asset isKindOfClass:[AVURLAsset class]]) {
NSURL *localVideoUrl = [(AVURLAsset *)asset URL];
NSData *videoData= [NSData dataWithContentsOfURL:localVideoUrl];
dataResponse(videoData);
}
}];
}
The problem with this approach is that the app simply runs out of memory whenever large or multiple video files are uploaded. I suppose this is because the whole file is loaded as NSData (the filedata in the method above). I've tried requesting the file path instead, using the method
appendPartWithFileURL instead of appendPartWithFileData
It works on the simulator but fails on a real device with an error saying it can't read the file at the specified path. I've described this issue here:
PHAsset + AFNetworking. Unable to upload files to the server on a real device
=======================================
Update: I've modified my code in order to test the approach of uploading the file by its local path on a new iPhone 6s+, as follows:
NSURLRequest *urlRequest = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST"
    URLString:[[entity uploadUrl] absoluteString]
    parameters:entity.params
    constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        NSString *mimeType = [FileHelper mimeTypeForFileAtUrl:entity.fileUrl];
        NSError *fileappenderror;
        [formData appendPartWithFileURL:entity.fileUrl name:@"data" fileName:entity.filename mimeType:mimeType error:&fileappenderror];
        if (fileappenderror) {
            [Sys MyLog:[NSString stringWithFormat:@"Failed to append: %@", [fileappenderror localizedDescription]]];
        }
    } error:&urlRequestError];
Testing on the iPhone 6s+ gives a clearer log warning. It occurs as a result of invoking appendPartWithFileURL:
<Warning>: my_log: Failed to append file: The operation couldn’t be completed. File URL not reachable.
deny(1) file-read-metadata /private/var/mobile/Media/DCIM/100APPLE/IMG_0008.MOV
15:41:25 iPhone-6s kernel[0] <Notice>: Sandbox: My_App(396) deny(1) file-read-metadata /private/var/mobile/Media/DCIM/100APPLE/IMG_0008.MOV
15:41:25 iPhone-6s My_App[396] <Warning>: my_log: Failed to append file: The file “IMG_0008.MOV” couldn’t be opened because you don’t have permission to view it.
Here is the code used to fetch the local file path from a PHAsset:
if (mPhasset.mediaType == PHAssetMediaTypeImage) {
PHContentEditingInputRequestOptions * options = [[PHContentEditingInputRequestOptions alloc]init];
options.canHandleAdjustmentData = ^BOOL(PHAdjustmentData *adjustmeta){
return YES;
};
[mPhasset requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput * _Nullable contentEditingInput, NSDictionary * _Nonnull info) {
dataResponse(contentEditingInput.fullSizeImageURL);
}];
}else if(mPhasset.mediaType == PHAssetMediaTypeVideo){
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
[[PHImageManager defaultManager] requestAVAssetForVideo:mPhasset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
if ([asset isKindOfClass:[AVURLAsset class]]) {
NSURL *localVideoUrl = [(AVURLAsset *)asset URL];
dataResponse(localVideoUrl);
}
}];
}
So the issue remains the same: the files uploaded to the server are empty.
The proposed solution above is only partially correct (and I had found it myself earlier). Since the system doesn't permit reading files outside the sandbox, the files cannot be accessed (read/written) by their file path; they can only be copied. On iOS 9 and above the Photos framework provides an API to copy the file into your app's sandbox directory (it cannot be done through NSFileManager, only with the Photos framework API). Here is the code I used after digging through the docs and header files.
First of all, copy the file to the app's sandbox directory:
// Assuming the PHAsset has only one resource file.
PHAssetResource *resource = [[PHAssetResource assetResourcesForAsset:myPhasset] firstObject];

+ (void)writeResourceToTmp:(PHAssetResource *)resource pathCallback:(void (^)(NSURL *localUrl))pathCallback {
    // Get the asset resource. Take the first resource object, since there is only one file.
    NSString *filename = resource.originalFilename;
    NSString *pathToWrite = [NSTemporaryDirectory() stringByAppendingString:filename];
    NSURL *localpath = [NSURL fileURLWithPath:pathToWrite];
    PHAssetResourceRequestOptions *options = [PHAssetResourceRequestOptions new];
    options.networkAccessAllowed = YES;
    [[PHAssetResourceManager defaultManager] writeDataForAssetResource:resource
                                                                toFile:localpath
                                                               options:options
                                                     completionHandler:^(NSError * _Nullable error) {
        if (error) {
            [Sys MyLog:[NSString stringWithFormat:@"Failed to write a resource: %@", [error localizedDescription]]];
        }
        pathCallback(localpath);
    }];
} // writeResourceToTmp
The upload task itself:
NSError *urlRequestError = nil;
NSURLRequest *urlRequest = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST"
    URLString:[[entity uploadUrl] absoluteString]
    parameters:entity.params
    constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        // Assuming the PHAsset has only one resource file.
        PHAssetResource *resource = [[PHAssetResource assetResourcesForAsset:myPhasset] firstObject];
        [FileHelper writeResourceToTmp:resource pathCallback:^(NSURL *localUrl) {
            NSString *mimeType = [FileHelper mimeTypeForFileAtUrl:entity.fileUrl];
            NSError *fileappenderror = nil;
            [formData appendPartWithFileURL:localUrl name:@"data" fileName:entity.filename mimeType:mimeType error:&fileappenderror];
        }]; // writeResourceToTmp
    } error:&urlRequestError];

AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:urlRequest];
// ...
// Further steps are described in the AFNetworking docs.
This upload method has a significant drawback: you're stuck if the device goes into sleep mode. Therefore the recommended upload approach here is to use uploadTaskWithRequest:fromFile:progress:completionHandler: on AFURLSessionManager.
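A rough sketch of that recommended approach, assuming AFNetworking 2.x: serialize the multipart body to a file first with requestWithMultipartFormRequest:writingStreamContentsToFile:completionHandler:, then hand that file to AFURLSessionManager so nothing large is held in memory. The session identifier and file name below are illustrative.

NSURL *bodyFileURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"multipart-body.tmp"]];
[[AFHTTPRequestSerializer serializer] requestWithMultipartFormRequest:urlRequest
                                          writingStreamContentsToFile:bodyFileURL
                                                     completionHandler:^(NSError *writeError) {
    if (writeError) {
        NSLog(@"Could not write body file: %@", writeError);
        return;
    }
    // A background session keeps the transfer going even if the device sleeps or the app is suspended.
    NSURLSessionConfiguration *config =
        [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"com.example.upload"]; // identifier is illustrative
    AFURLSessionManager *manager = [[AFURLSessionManager alloc] initWithSessionConfiguration:config];
    // NSURLSession ignores the request's body stream when a file is supplied via fromFile:.
    NSURLSessionUploadTask *task =
        [manager uploadTaskWithRequest:urlRequest
                              fromFile:bodyFileURL
                              progress:nil
                     completionHandler:^(NSURLResponse *response, id responseObject, NSError *error) {
        NSLog(@"%@ %@", responseObject, error);
    }];
    [task resume];
}];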
For versions below iOS 9: in the case of images you can fetch the NSData from the PHAsset as shown in the code in my question and upload it, or write it to your app's sandbox storage first before uploading. This approach is not usable for large files. Alternatively, you may want to use an image/video picker that exports files as ALAsset; ALAsset provides an API that allows you to read the file from storage, but you still have to write it to sandbox storage before uploading.
Creating an NSData for every video can be a bad idea, because videos (or any other files) can be much bigger than the RAM of the device.
I'd suggest uploading as a "file" and not as "data": if you append a file, AFNetworking sends the data from disk chunk by chunk and won't try to read the whole file at once. Try using
- (BOOL)appendPartWithFileURL:(NSURL *)fileURL
name:(NSString *)name
error:(NSError * __autoreleasing *)error
Also have a look at https://github.com/AFNetworking/AFNetworking/issues/828
In your case, use it like this:
NSError *urlRequestError = nil;
NSURLRequest *urlRequest = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST"
    URLString:[[entity uploadUrl] absoluteString]
    parameters:entity.params
    constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        NSError *fileappenderror = nil;
        [formData appendPartWithFileURL:entity.fileUrl name:entity.filename error:&fileappenderror];
        if (fileappenderror) {
            NSLog(@"%@", fileappenderror);
        }
    } error:&urlRequestError];
See my answer here to your other question. Seems relevant here as the subject matter is linked.
In that answer I describe that in my experience Apple does not give us direct access to video source files associated with a PHAsset. Or an ALAsset for that matter.
In order to access the video files and upload them, you must first create a copy using an AVAssetExportSession. Or using the iOS9+ PHAssetResourceManager API.
You should not use any methods that load data into memory as you'll quickly run up against OOM exceptions. And it's probably not a good idea to use the requestAVAssetForVideo(_:options:resultHandler:) method as stated above because you will at times get an AVComposition as opposed to an AVAsset (and you can't get an NSURL from an AVComposition directly).
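For illustration, here's a minimal sketch of the export-a-copy approach using PHImageManager's export-session API; the preset, output path, and file type are assumptions, and `asset` stands for your PHAsset:

// Export the PHAsset's video into the app sandbox, then upload the resulting file URL.
[[PHImageManager defaultManager] requestExportSessionForVideo:asset
                                                       options:nil
                                                  exportPreset:AVAssetExportPresetHighestQuality
                                                 resultHandler:^(AVAssetExportSession *exportSession, NSDictionary *info) {
    NSString *outPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.mov"]; // illustrative name
    exportSession.outputURL = [NSURL fileURLWithPath:outPath];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // Hand exportSession.outputURL to the upload task (e.g. uploadTaskWithRequest:fromFile:...).
        }
    }];
}];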
You also probably don't want to use any upload method that leverages AFHTTPRequestOperation or related APIs because they are based on the deprecated NSURLConnection API as opposed to the more modern NSURLSession APIs. NSURLSession will allow you to conduct long running video uploads in a background process, allowing your users to leave your app and be confident that the upload will complete regardless.
In my original answer I mention VimeoUpload. It's a library that handles video file uploads to Vimeo, but its core can be repurposed to handle concurrent background video file uploads to any destination. Full disclosure: I'm one of the library's authors.
So I am using Parse (which is pretty sweet) and I'm in the process of downloading files (short video files, no more than 1 MB) from the Parse server to my application to play. The way it works (per the documentation) is:
PFFile *videoFile = [[tempArray objectAtIndex:i] objectForKey:@"track"];
[videoFile getDataInBackgroundWithBlock:^(NSData *data, NSError *error) {
    if (!error) {
        NSString *dataString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
        NSURL *videoURL = [NSURL URLWithString:dataString];
        // now do something with this videoURL (i.e. play it!)
        [data writeToFile:@"trackFile" atomically:YES];
        NSURL *filePath = [NSURL fileURLWithPath:@"trackFile"];
        NSLog(@"File Path: %@", filePath);
        AVAsset *asset = [AVAsset assetWithURL:filePath];
        AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
    }
}];
On download completion you are supposed to create a string from the data and then a URL from the string. The only problem is that dataString always returns NULL/nil. I have confirmed that the data property is not empty and does in fact hold the video data. Why is this happening? Any help would be greatly appreciated. Thanks in advance!
I have confirmed that the data property is not empty and does in fact hold the video data.
Video data is not a UTF-8 string. It's definitely not a UTF-8 string representation of a URL. So when you say "interpret this video data as UTF-8," Cocoa rightly responds that it is not UTF-8 (because it's video data).
The simplest solution is to write this to disk, and then play the file.
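A minimal, untested sketch of that: write the downloaded data to a real path in the temp directory, then play it from a file URL (the file name and extension below are assumptions):

NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"track.mp4"]; // extension is a guess
NSError *writeError = nil;
if ([data writeToFile:path options:NSDataWritingAtomic error:&writeError]) {
    NSURL *fileURL = [NSURL fileURLWithPath:path];
    AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:[AVAsset assetWithURL:fileURL]];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    [player play];
} else {
    NSLog(@"Failed to write file: %@", writeError);
}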
A couple of things need to be determined first:
does the NSData object actually have any data, i.e., data.length > 0?
does the NSData object hold the video data OR the URL of the video data (not exactly clear)?
Assuming the NSData object holds the URL and its length is greater than 0, you might want to try:
NSString* dataString = [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
I am very new to iOS development, but I would like to make an app that has two table view controllers (columns), each a column of images that act as links. The first would be a column of YouTube videos and the second a column of websites. I would like to have all of these listed in a file, file.txt, like so:
V, http://youtube.com/example
W, http://example.com
There would be a long list of those, the V meaning it's a video (for the video column) and W for the websites. Now, I understand how to bring the single file in, but what happens afterwards is my concern. Can I read each line into some sort of queue and then fire the NSURL request for each one consecutively? How can that be done with NSURL? Is there perhaps a better approach?
There are two questions for me:
Is a text file really the best format?
I might suggest a plist or archive (if the file is only going to exist in your app's bundle and/or documents folder) or JSON (if it's going to live on a server before being delivered to the user) instead of a text file. It will be easier to parse than a plain text file. For example, consider the following dictionary:
NSDictionary *dictionary = @{@"videos"   : @[@"http://youtube.com/abc", @"http://vimeo.com/xyz"],
                             @"websites" : @[@"http://apple.com", @"http://microsoft.com"]};
You can save that to a plist with:
NSString *documentsPath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
NSString *plistPath = [documentsPath stringByAppendingPathComponent:@"files.plist"];
[dictionary writeToFile:plistPath atomically:YES];
You can add that file to your bundle or whatever, and then read it at a future date with:
dictionary = [NSDictionary dictionaryWithContentsOfFile:plistPath];
You can, alternatively, write that to a JSON file with:
NSError *error = nil;
NSData *data = [NSJSONSerialization dataWithJSONObject:dictionary options:NSJSONWritingPrettyPrinted error:&error];
NSString *jsonPath = [documentsPath stringByAppendingPathComponent:@"files.json"];
[data writeToFile:jsonPath atomically:YES];
You can read that JSON file with:
data = [NSData dataWithContentsOfFile:jsonPath];
dictionary = [NSJSONSerialization JSONObjectWithData:data options:0 error:&error];
Either way, you can get the list of videos or web sites like so:
NSArray *videos = dictionary[@"videos"];
NSArray *websites = dictionary[@"websites"];
Now that you have your arrays of videos and websites, the question then is how you then use those URLs.
You could do something like:
for (NSString *urlString in videos) {
NSURL *url = [NSURL URLWithString: urlString];
// now do something with the URL
}
The big question is what that "do something" logic is. Because you're dealing with a lot of URLs, you would want to use an NSOperation-based solution rather than a GCD solution, because NSOperationQueue lets you control the degree of concurrency. I'd suggest an NSOperation-based networking library like AFNetworking. For example, to download the HTML for your websites:
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 4;
for (NSString *urlString in websites)
{
NSURL *url = [NSURL URLWithString:urlString];
NSURLRequest *request = [NSURLRequest requestWithURL:url];
AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:request];
[operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
// convert the `NSData` responseObject to a string, if you want
NSString *string = [[NSString alloc] initWithData:responseObject encoding:NSUTF8StringEncoding];
// now do something with it, like saving it in a cache or persistent storage
// I'll just log it
NSLog(#"responseObject string = %#", string);
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
NSLog(#"error = %#", error);
}];
[queue addOperation:operation];
}
Having said that, I'm not sure it makes sense to kick off a ton of network requests. Wouldn't you really prefer to wait until the user taps on one of those cells before retrieving it (and for example, then just open that URL in a UIWebView)? You don't want an app that unnecessarily chews up the user's data plan and battery retrieving stuff that they might not want to retrieve. (Apple has rejected apps that request too much data from a cellular connection.) Or, at the very least, if you want to retrieve stuff up front, only retrieve stuff as you need it (e.g. in cellForRowAtIndexPath), which will retrieve the visible rows, rather than the hundreds of rows that might be in your text/plist/json file.
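As a rough, illustrative sketch of that lazier approach (the videos/websites arrays and the webView property are assumed to exist on the view controller):

- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    // Only fetch when the user actually asks for a row's content.
    NSString *urlString = (indexPath.section == 0) ? self.videos[indexPath.row] : self.websites[indexPath.row];
    NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:urlString]];
    [self.webView loadRequest:request]; // self.webView assumed to be a UIWebView
}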
Frankly, we need a clearer articulation of what you're trying to do, and we might be able to help you with more concise counsel.
I have an MP3 file on a server. I want to get this file's information: the size of the file, the artist's name, the album name, when the file was created, when it was modified, etc. I want all this information.
Is it possible to get this information without actually downloading the whole file? Using NSURLConnection or otherwise?
EDIT:
The following code doesn't give me the required information (i.e. file created by, artist name, etc.):
NSError *rerror = nil;
NSURLResponse *response = nil;
NSURL *url = [NSURL URLWithString:@"http://link.to.mp3"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
[request setHTTPMethod:@"HEAD"];
NSData *result = [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&rerror];
NSString *resultString = [[[NSString alloc] initWithData:result encoding:NSUTF8StringEncoding] autorelease];
NSLog(@"URL: %@", url);
NSLog(@"Request: %@", request);
NSLog(@"Result (NSData): %@", result);
NSLog(@"Result (NSString): %@", resultString);
NSLog(@"Response: %@", response);
NSLog(@"Error: %@", rerror);
if ([response isMemberOfClass:[NSHTTPURLResponse class]]) {
    NSLog(@"AllHeaderFields: %@", [((NSHTTPURLResponse *)response) allHeaderFields]);
}
The "AllHeaderFields" is:
AllHeaderFields: {
"Cache-Control" = "max-age=0";
Connection = "keep-alive";
"Content-Encoding" = gzip;
"Content-Type" = "text/plain; charset=ascii";
Date = "Fri, 17 Feb 2012 12:44:59 GMT";
Etag = 19202n;
Pragma = public;
Server = dbws;
"x-robots-tag" = "noindex,nofollow";
}
It is quite possible to get the ID3 information embedded in an MP3 file (artist name, track title) without downloading the whole file or using low-level APIs. The functionality is part of the AVFoundation framework.
The class to look at is AVAsset and specifically its network-friendly subclass AVURLAsset. AVAsset has an NSArray property named commonMetadata. This commonMetadata property will contain instances of AVMetadataItem, assuming of course that the referenced URL contains metadata. You will usually use the AVMetadataItem's commonKey property to reference the item. I find this method of iterating through an array checking commonKeys irritating, so I create an NSDictionary using the commonKey property as the key and the value property as the object. Like so:
- (NSDictionary *)dictionaryOfMetadataFromAsset:(AVAsset *)asset {
    NSMutableDictionary *metaData = [[NSMutableDictionary alloc] init];
    for (AVMetadataItem *item in asset.commonMetadata) {
        if (item.value && item.commonKey) {
            [metaData setObject:item.value forKey:item.commonKey];
        }
    }
    return [metaData copy];
}
With the addition of this simple method the AVAsset's metadata becomes quite easy to use. Here is an example of getting an MP3's metadata through a URL:
NSURL *mp3URL = [NSURL URLWithString:@"http://'AddressOfMP3File'"];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:mp3URL options:nil];
NSDictionary *metaDict = [self dictionaryOfMetadataFromAsset:asset];
NSLog(@"Available metadata: %@", metaDict.allKeys);
NSLog(@"title: %@", [metaDict objectForKey:@"title"]);
I have found that this code seems to load just the first few seconds of your MP3 file. Also note that this code is synchronous, so use it with caution; AVURLAsset does have some async functionality described in the docs.
Once you have the AVAsset you can create an AVPlayerItem with it and feed that to an AVPlayer and play it, or not.
Yes and no. Things like the file size and modification date often come as part of the HEAD response. But not always: with a lot of dynamic URLs, you won't get all of the information.
As for the artist and album name, they're part of the MP3's ID3 tag, which is contained inside the file, so you won't be able to get them with a HEAD request. Since the ID3 tag is typically at the beginning of the file, you could try to grab just that part and then read the ID3 tag yourself; one way is an HTTP range request for the first chunk of the file, which only works if the server honors Range headers, and you'd still have to parse the ID3 data at a fairly low level.
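A minimal sketch of fetching just the first part of the file with a Range header; the byte count is an arbitrary assumption, the server must support range requests, and the ID3 parsing itself is left to you:

// Request only the first 128 KB of the file, which normally contains the ID3v2 tag
// (the exact size needed depends on the tag; 128 KB is an arbitrary choice).
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://link.to.mp3"]];
[request setValue:@"bytes=0-131071" forHTTPHeaderField:@"Range"];
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    // If the server honors the Range header, the HTTP status will be 206 (Partial Content)
    // and `data` will hold just the requested bytes; parse the ID3 tag out of it yourself.
    NSInteger status = [(NSHTTPURLResponse *)response statusCode];
    NSLog(@"status: %ld, bytes received: %lu", (long)status, (unsigned long)data.length);
}];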
Yep, you're right on target with NSURLConnection.
I think you want to send a HEAD request for the resource you want information about and then check the information you receive in connection:didReceiveResponse: and connection:didReceiveData:
Edit
Admittedly I didn't read your question in its entirety. It won't be possible to get ID3 information this way, but you should be able to get the file size and maybe the creation date, etc.
This answer does give some good information about how to get the ID3 information. You'd need to set up a PHP page to examine the MP3 file server-side and return just the information you require instead of the entire MP3.