Is it possible to upload any file (doc, pdf, img) in Xamarin Android using a web service in C#? I am using SQL Server for the data connection.
Use MediaFile from the Xamarin.Plugins.Media library (the MediaFile typically comes from the plugin's pick/take photo methods).
public static void upload(MediaFile mediaFile)
{
    try
    {
        // Wrap the picked file's stream as a multipart form-data part
        StreamContent scontent = new StreamContent(mediaFile.GetStream());
        scontent.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data")
        {
            FileName = "newimage",
            Name = "image"
        };
        scontent.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");

        var client = new HttpClient();
        var multi = new MultipartFormDataContent();
        multi.Add(scontent);

        client.BaseAddress = new Uri(Constants.API_ROOT_URL);
        // .Result blocks the calling thread; prefer await in async code
        var result = client.PostAsync("api/photo", multi).Result;
        Debug.WriteLine(result.ReasonPhrase);
    }
    catch (Exception e)
    {
        Debug.WriteLine(e);
    }
}
And the Node.js code to receive the request:
var multer = require('multer');

// Store uploads on disk under ./uploads, named <fieldname>-<timestamp>
var storage = multer.diskStorage({
    destination: function (req, file, callback) {
        callback(null, './uploads');
    },
    filename: function (req, file, callback) {
        console.log(file.fieldname);
        callback(null, file.fieldname + '-' + Date.now());
    }
});

// Accept a single file from the multipart field named 'image'
var upload = multer({ storage: storage }).single('image');

exports.post = function (req, res) {
    upload(req, res, function (err) {
        console.log(req.file);
        if (err) {
            console.log("ERROR: " + err);
            return res.end("Error uploading file.");
        }
        console.log("SUCCESS");
        res.end("File is uploaded");
    });
};
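For context, a minimal sketch of how this handler could be wired into an Express app; the module path ./photo and port 3000 are assumptions, not part of the original post:

var express = require('express');
var photo = require('./photo'); // the module exporting `post` above (assumed path)

var app = express();
// the route matches the "api/photo" path the Xamarin client posts to
app.post('/api/photo', photo.post);
app.listen(3000, function () {
    console.log('Listening on http://localhost:3000');
});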
See the complete discussion in the threads below:
https://forums.xamarin.com/discussion/18649/best-practice-to-upload-image-selected-to-a-web-api
https://forums.xamarin.com/discussion/5033/upload-file-from-device-to-server
I am trying to post a file object to my Nuxt 3 API route. The problem is: the data logged on the client contains my file object, but the data returned from the server is an empty object. Where did my file object go?
const handleImageUpload = async (evt: Event) => {
    const target = evt.target as HTMLInputElement
    if (target.files) {
        const file = target.files[0]
        const upload: iUpload = {
            name: file.name,
            type: file.type,
            file
        }
        console.log("data from client", upload)
        try {
            const { data, error } = await useFetch(constants.imageUploadApiUrl, {
                headers: { "Content-type": "application/json" },
                method: 'POST',
                body: upload
            })
            console.log("data from server", data.value)
        } catch (error) {
            console.log(error)
        }
    }
}
constants.imageUploadApiUrl (the API route) contains the following:
import { getQuery, readBody } from "h3"
import { iUpload } from "~~/helpers/interfaces"

export default defineEventHandler(async (event) => {
    try {
        const query = getQuery(event)
        const body = await readBody(event) as iUpload
        return { body }
    } catch (error: any) {
        return { error: error.message }
    }
})
The iUpload interface is this:
export interface iUpload {
    name: string;
    type: string;
    file: File;
}
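The likely culprit (a general browser fact, not something stated in the original post): a File is a Blob, and neither its binary payload nor its metadata are enumerable own properties, so JSON-serializing it produces an empty object. You can see this directly:

// runnable in any browser console — a File's payload does not survive JSON serialization
const f = new File(["hello"], "hello.txt", { type: "text/plain" })
console.log(JSON.stringify(f)) // '{}'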
I eventually got it working. It's using Supabase as its backend (forgot to mention that).
Here are the changes I made.
#1 - I added a utility function to convert the file to a base64 string
export const getBase64 = (file: File) => {
    return new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.readAsDataURL(file);
        reader.onload = () => resolve(reader.result);
        reader.onerror = error => reject(error);
    });
}
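Note that readAsDataURL resolves to a full data URL rather than a bare base64 string, which is what makes the fetch(body.file) call in the server route below work. A quick usage sketch (the sample file is an assumption for illustration):

// "hi" encodes to aGk= in base64
const sample = new File(["hi"], "hi.txt", { type: "text/plain" })
const dataUrl = await getBase64(sample) as string
console.log(dataUrl) // "data:text/plain;base64,aGk="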
#2 - I updated the handleImageUpload function as below. The main change is the file key, which now holds the base64 data URL (a path field is also added).
const handleImageUpload = async (evt: Event) => {
    const target = evt.target as HTMLInputElement
    if (target.files) {
        const fileObj = target.files[0]
        const upload: iUpload = {
            path: id(memberName(store.selected), '-'),
            name: fileObj.name,
            file: await getBase64(fileObj) as string, // <=== changed
            type: fileObj.type
        }
        console.log("data from client", upload)
        try {
            const { data, error } = await useFetch(constants.imageUploadApiUrl, {
                headers: { "Content-type": "multipart/form-data" },
                method: 'POST',
                body: upload
            })
            console.log("data from server", data.value)
        } catch (error) {
            console.log(error)
        }
    }
}
#3 - Furthermore I updated the server route as follows:
export default defineEventHandler(async (event) => {
    try {
        const body = await readBody(event) as iUpload
        const filePath = `${body.path}/${body.name}`
        // body.file is a base64 data URL; fetching it yields the file as a Blob
        const res = await fetch(body.file)
        const blob = await res.blob()
        const response = await supabase.storage
            .from("pictures")
            .upload(filePath, blob, {
                contentType: body.type,
                upsert: true,
            })
        return {
            data: response.data,
            error: response.error?.message,
        }
    } catch (error: any) {
        return { error: error.message }
    }
})
#4 - Lastly I updated the policies on the Supabase storage bucket and storage objects to the following:
(screenshot of the Supabase storage policy update not reproduced here)
Below I try to respond with a stream when I receive ticker updates.
+page.server.js:
import YahooFinanceTicker from "yahoo-finance-ticker";

const ticker = new YahooFinanceTicker();
const tickerListener = await ticker.subscribe(["BTC-USD"])

const stream = new ReadableStream({
    start(controller) {
        tickerListener.on("ticker", (ticker) => {
            console.log(ticker.price);
            controller.enqueue(ticker.price);
        });
    }
});

export async function load() {
    return response????
};
Note: the YahooFinanceTicker can't run in the browser.
How do I handle/set the response in the SvelteKit load function?
To my knowledge, the load functions cannot be used for this as their responses are JS/JSON serialized. You can use an endpoint in +server to return a Response object which can be constructed from a ReadableStream.
Solution: H.B.'s comment showed me the right direction for pushing unsolicited price ticker updates to the client.
api route: yahoo-finance-ticker +server.js
import YahooFinanceTicker from "yahoo-finance-ticker";

const ticker = new YahooFinanceTicker();
const tickerListener = await ticker.subscribe(["BTC-USD"])

/** @type {import('./$types').RequestHandler} */
export function GET({ request }) {
    const ac = new AbortController();
    console.log("GET api: yahoo-finance-ticker")
    const stream = new ReadableStream({
        start(controller) {
            tickerListener.on("ticker", (ticker) => {
                console.log(ticker.price);
                controller.enqueue(String(ticker.price));
            }, { signal: ac.signal }); // caution: a plain EventEmitter ignores this third argument, so the listener may need explicit removal instead
        },
        cancel() {
            console.log("cancel and abort");
            ac.abort();
        },
    })
    return new Response(stream, {
        headers: {
            'content-type': 'text/event-stream',
        }
    });
}
page route: +page.svelte
<script>
    let result = "";

    async function getStream() {
        const response = await fetch("/api/yahoo-finance-ticker");
        const reader = response.body.pipeThrough(new TextDecoderStream()).getReader();
        while (true) {
            const { value, done } = await reader.read();
            console.log("resp", done, value);
            if (done) break;
            result += `${value}<br>`;
        }
    }

    getStream();
</script>

<section>
    <p>{@html result}</p>
</section>
I would like my uploadFormPage() function to be able to take JPEGs and PDFs. Is it possible to have two file types for the same FormData() const?
export function uploadFormPage(documentId, formId, file, callback) {
    return async dispatch => {
        try {
            const formData = new FormData();
            formData.append('page', {
                name: `document-${documentId}-${formId}-${Date.now()}.jpg`,
                type: 'image/jpeg',
                uri: file,
            });
            const result = await Api.uploadFiles(formData);
            const entity = {
                id: formId,
                resourceKey: result.page,
            };
            const rsp = await Api.uploadFormPage(documentId, entity);
            dispatch({type: LOAD_DOCUMENTS, data: rsp});
            callback(null, rsp);
        } catch (e) {
            callback(e, null);
        }
    };
}
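FormData itself doesn't constrain file types; each appended part carries its own name and MIME type, so the metadata can be chosen per file. A minimal sketch, assuming file is a local URI string that ends in the real extension (.jpg or .pdf):

// hypothetical helper: derive the extension and MIME type from the URI
export function buildPagePart(documentId, formId, file) {
    const isPdf = file.toLowerCase().endsWith('.pdf');
    const ext = isPdf ? 'pdf' : 'jpg';
    const formData = new FormData();
    formData.append('page', {
        name: `document-${documentId}-${formId}-${Date.now()}.${ext}`,
        type: isPdf ? 'application/pdf' : 'image/jpeg',
        uri: file,
    });
    return formData;
}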
So, I am trying to upload big files in chunks using Angular 5 and the FileAPI library.
This is what my request looks like:
const formData: FormData = new FormData();
formData.append('fileKey', this.fileToUpload, this.fileToUpload.name);
FileAPI.upload({
    url: 'http://localhost:22166/api/lbx/upload',
    headers: {
        'authorization': `Bearer ${this._svc.getToken()}`
        // 'Content-Disposition': `attachment; filename=${this.fileToUpload.name}; name=zip`
    },
    files: {
        file: formData
    },
    // formData: true,
    // chunkSize: 1.0 * FileAPI.MB,
    progress: function (evt) {
        console.log((evt.loaded / evt.total) * 100);
    },
    complete: function (err, xhr) {
        if (err) {
            const message = JSON.parse(xhr.responseText).ExceptionMessage;
            parent.saving = false;
            parent.snack(message);
        } else {
            console.log('completed');
        }
    }
});
This is what my Post looks like:
[HttpPost]
[Route("lbx/upload")]
[EnableCors(origins: "*", headers: "*", methods: "*", SupportsCredentials = true)]
public async Task<IHttpActionResult> UploadLBXProject()
{
    try
    {
        if (!Request.Content.IsMimeMultipartContent("form-data"))
        {
            const string errorMsg = "Request content is not MIME multipart content and is unsupported.";
            AppEventLog.Error(errorMsg);
            return StatusCode(HttpStatusCode.UnsupportedMediaType);
        }
        var root = HttpContext.Current.Server.MapPath("~/App_Data");
        var result = await Request.Content.ReadAsMultipartAsync(new MultipartFormDataStreamProvider(root)); // here it fails
        var uploadedFile = result.FileData.First();
        var originalFileName = JsonConvert.DeserializeObject(uploadedFile.Headers.ContentDisposition.FileName).ToString();
        var uploadedFileInfo = new FileInfo(uploadedFile.LocalFileName);
        var part = ".part_1";

        // Get the original file name that was on the client side for "BodyPart_"
        var newZipFileName = uploadedFileInfo.FullName.Replace(uploadedFileInfo.Name, originalFileName + part);
        var regex = new Regex("\\d+");

        // If a zip file with the original name already exists, rename it
        while (File.Exists(newZipFileName))
        {
            var s = newZipFileName.Split('.');
            var np = ".part_" + (int.Parse(regex.Match(s[s.Length - 1]).Value) + 1);
            newZipFileName = newZipFileName.Replace(part, np);
            part = np;
        }
        var extractedFolderName = newZipFileName.Replace(".zip", "");
        if (Directory.Exists(extractedFolderName))
        {
            Directory.Delete(extractedFolderName, true);
        }
        File.Move(uploadedFileInfo.FullName, newZipFileName);
        AppEventLog.Info("upload of zip file finished");
        var response = new HttpResponseMessage(HttpStatusCode.OK);
        // expose the custom header on the response itself, not on the (empty) content
        response.Headers.Add("Access-Control-Expose-Headers", "X-Last-Known-Byte");
        return ResponseMessage(response);
    }
    catch (Exception e)
    {
        // Delete the partially uploaded files if an exception occurred or the user aborted
        var uploadedFilesPath = HttpContext.Current.Server.MapPath("~/App_Data/");
        foreach (var fileInfo in new DirectoryInfo(uploadedFilesPath).GetFiles())
        {
            File.Delete(fileInfo.FullName);
        }
        return InternalServerError(e);
    }
}
This works fine without the FileAPI library, using this client-side code instead:
uploadLBX(file: File): Observable<any> {
    const formData: FormData = new FormData();
    formData.append('fileKey', file, file.name);
    const query = this._config.WEB_API_BASE_URL + this._config._api_urls.post.lbx.uploadzip;
    const headers = this.getHeaders();
    return this._http.post(query, formData, { headers: headers })
        .do(data => console.log(`file uploaded successfully`))
        .catch(this.handleError);
}
However, it does not work with big files, even though my web.config already has the maximum value for requests:
<system.web>
    <compilation debug="true" targetFramework="4.5.2" />
    <httpRuntime targetFramework="4.5.2" maxRequestLength="2147483647" />
</system.web>
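As an aside (my addition, not from the original post): maxRequestLength only raises the ASP.NET limit. When the site is hosted on IIS, request filtering imposes a separate maxAllowedContentLength limit, which defaults to roughly 30 MB, so a matching system.webServer entry may also be needed:

<system.webServer>
    <security>
        <requestFiltering>
            <!-- value is in bytes (maxRequestLength above is in kilobytes) -->
            <requestLimits maxAllowedContentLength="2147483647" />
        </requestFiltering>
    </security>
</system.webServer>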
The exception:
"Unexpected end of MIME multipart stream. MIME multipart message is not complete."
What can I do?
Thanks
EDIT
Added network request
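One thing worth checking (my observation, not part of the original post): FileAPI builds the multipart request body itself from the File objects passed in its files map, so handing it a FormData wrapper instead of the raw File may produce a body that MultipartFormDataStreamProvider cannot read to completion, which would match the "Unexpected end of MIME multipart stream" error. A minimal sketch passing the raw File under the field name the server reads:

FileAPI.upload({
    url: 'http://localhost:22166/api/lbx/upload',
    headers: { 'authorization': `Bearer ${this._svc.getToken()}` },
    files: {
        fileKey: this.fileToUpload // raw File object, not a FormData wrapper
    },
    progress: function (evt) {
        console.log((evt.loaded / evt.total) * 100);
    },
    complete: function (err, xhr) {
        console.log(err ? 'upload failed' : 'completed');
    }
});

Note that turning chunkSize back on makes FileAPI issue several sequential requests, which the Web API action above would then need to reassemble; that is a separate server-side change.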
I am using Socket.IO and Redis, based on the following code:
var sub = redis.createClient();
var pub = redis.createClient();
sub.subscribe('chat');

io.use(socketHandshake({ store: sessionStore, key: 'jsessionid', secret: 'secret', parser: cookieParser() }));

io.on('connection', function (socket) {
    socket.on('chat', function (message) {
        // io.emit('chat', "hello world");
        pub.publish('chat', "hello world");
    });
    sub.on('message', function (channel, message) {
        io.emit(channel, message);
    });
});
This is the base code. I have modified it so that if a user goes offline, the server stores their messages in RSMQ (Redis Simple Message Queue); when the user comes back online, the messages are fetched from the queue and emitted to them. I used the following code to achieve this. The user status is stored in an array.
var fs = require('fs')
  , http = require('http')
  , socketio = require('socket.io');
var redis = require('redis');
var store = redis.createClient();
var pub = redis.createClient();
var sub = redis.createClient();
var RedisSMQ = require("rsmq");
var rsmq = new RedisSMQ({ host: "127.0.0.1", port: 6379, ns: "rsmq" });

var active_users = [];
var inactive_users = [];
var user_status = [];
var channel_users = [];
var users_queue = [];
var socket_ids = [];
var cname, qn;
var clients = [];

var server = http.createServer(function (req, res) {
    res.writeHead(200, { 'Content-type': 'text/html' });
    res.end(fs.readFileSync(__dirname + '/index.html'));
}).listen(9000, function () {
    console.log('Listening at: http://localhost:9000');
});

socketio.listen(server).on('connection', function (socket) {
    socket.on('login', function (data) {
        console.log('a user ' + data.userId + ' connected ' + socket.id);
        // save userId to array with socket ID
        active_users[socket.id] = data.userId;
        socket_ids[data.userId] = socket.id;
        clients[socket.id] = socket;
        user_status[data.userId] = "online";
    });

    socket.on('message', function (msg) {
        console.log('Message Received: ', msg);
        socket.broadcast.emit('message', msg);
    });

    socket.on('json', function (msg) {
        if (msg.channel_name == 'UserState') {
            rsmq.listQueues(function (err, resp) {
                // console.log("QUEUES LIST" + resp);
            });
            if (msg.user_state == 'active') {
                store.hmset("active_users." + msg.sender_id, { "user": "online" });
                user_status[msg.sender_id] = "online";
                console.log(user_status);
                if (users_queue[msg.sender_id] != undefined && users_queue[msg.sender_id].length > 0) {
                    console.log("USERS QUEUE:" + users_queue[msg.sender_id]['0']);
                    for (var i = 0; i < users_queue[msg.sender_id].length; i++) {
                        cname = users_queue[msg.sender_id][i].split('_')[0]; // get channel name from queue name
                        qn = users_queue[msg.sender_id][i];
                        rsmq.getQueueAttributes({ qname: users_queue[msg.sender_id][i] }, function (err, resp) {
                            console.log("RESP:" + resp.msgs);
                            if (resp.msgs > 0) { // if there are messages in the queue...
                                for (var j = 0; j < resp.msgs; j++) {
                                    rsmq.popMessage({ qname: qn }, function (err, resp) {
                                        console.log(resp);
                                        var sid = socket_ids[msg.sender_id]; // get socket.id for the user
                                        console.log("SOCKETID:" + sid);
                                        pub.publish(cname, resp.message);
                                    });
                                }
                            }
                        });
                    }
                }
            }
            else {
                store.hmset("active_users." + msg.sender_id, { "status": "offline" });
                user_status[msg.sender_id] = "offline";
            }
        }
        if (msg.channel_name == 'ShareConversation') {
            var channel = msg.conversations_data.conversation_id; // have to change to conversation_id or whatever channel...
            sub.subscribe(channel);
            channel_users[channel] = [];
            var m = msg.conversations_data.users.split(',');
            for (var i = 0; i < m.length; i++) {
                channel_users[channel].push(m[i]);
            }
            for (var i = 0; i < channel_users[channel].length; i++) {
                var q = channel_users[channel][i].split('#')[0].replace(/(^\s+|\s+$)/g, '');
                var queue_name = channel + "_" + q;
                console.log(queue_name);
                var uname = channel_users[channel][i].replace(/(^\s+|\s+$)/g, '');
                users_queue[uname] = [];
                users_queue[uname].push(queue_name);
                rsmq.createQueue({ qname: queue_name }, function (err, resp) {
                    console.log(err);
                    console.log(queue_name);
                    if (resp === 1) {
                        console.log("queue created");
                    }
                });
            }
        }
        socket.broadcast.emit('json', msg);
    });

    sub.on('message', function (channel, message) {
        console.log("Message: " + message);
        for (var i = 0; i < channel_users[channel].length; i++) {
            var c = channel_users[channel][i].replace(/(^\s+|\s+$)/g, '');
            console.log("channel_users:" + channel_users[channel][i]);
            console.log("USER STATE :" + user_status[c]);
            if (user_status[c] == 'offline') {
                // send notification...
                // put messages in queue...
                var q = channel_users[channel][i].split('#')[0].replace(/(^\s+|\s+$)/g, '');
                var queue_name = channel + "_" + q;
                console.log(queue_name);
                rsmq.sendMessage({ qname: queue_name, message: message }, function (err, resp) {
                    console.log(err);
                    if (resp) {
                        console.log("Message sent. ID:", resp);
                    }
                });
            }
        }
        socket.emit(channel, message);
    });
});
This is my entire code. The issue is that when a user goes offline, messages get saved to the queue multiple times, and when the user comes back online the messages are received multiple times because of the duplicates in the queue. How can I overcome this? Please help.
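A likely cause (my reading of the code above, not confirmed by the original post): sub.on('message', ...) is registered inside the connection handler, so every connected socket adds another copy of the listener, and each published message is then processed (and queued) once per connected client. A minimal sketch of the fix, keeping the same variables as above and assuming io holds the value returned by socketio.listen(server):

var io = socketio.listen(server);

// register the Redis subscriber handler ONCE, outside the connection callback
sub.on('message', function (channel, message) {
    for (var i = 0; i < channel_users[channel].length; i++) {
        var c = channel_users[channel][i].replace(/(^\s+|\s+$)/g, '');
        if (user_status[c] == 'offline') {
            var q = channel_users[channel][i].split('#')[0].replace(/(^\s+|\s+$)/g, '');
            rsmq.sendMessage({ qname: channel + "_" + q, message: message }, function (err, resp) {
                if (resp) console.log("Message queued. ID:", resp);
            });
        }
    }
    io.emit(channel, message); // or target a specific user via socket_ids
});

io.on('connection', function (socket) {
    // keep only the per-socket handlers ('login', 'message', 'json') here
});

This way each published message is handled exactly once, no matter how many sockets are connected, so no duplicate entries land in the RSMQ queue.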