I am trying to access a file on another server from my application. Outside the application I can access the files from Windows Explorer, but when I use the same path in my application I get the error "Could not find a part of the path F:\Unknown\ABC\DEF\MNO\Fren.jpg".
My code:
String FilePath;
FilePath = Request.FilePath("\\\\ABC\\DEF\\MNO\\Fren.jpg");
System.Net.Mail.Attachment a = new System.Net.Mail.Attachment(FilePath);
What is the problem in my code?
Usually this is actually a permissions issue, not a file-not-found issue. Make sure that the account your application runs as (such as the IIS application pool account, or the ASPNET account, if this is done from within IIS) actually has permission to read the file.
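To tell the two failure modes apart, here is a small self-contained sketch (the temp folder and file names are stand-ins for the share path): a missing path component throws DirectoryNotFoundException with exactly the "Could not find a part of the path" message, while a permissions problem throws UnauthorizedAccessException.

```csharp
using System;
using System.IO;

class PathProbe
{
    static void Main()
    {
        // Hypothetical stand-in for the share: a temp folder with one file.
        string dir = Path.Combine(Path.GetTempPath(), "probe-" + Guid.NewGuid());
        Directory.CreateDirectory(dir);
        string good = Path.Combine(dir, "Fren.jpg");
        File.WriteAllBytes(good, new byte[] { 1 });

        // A readable path opens without error.
        using (FileStream fs = File.OpenRead(good))
            Console.WriteLine("opened, length " + fs.Length);

        // A path with a missing *component* throws DirectoryNotFoundException
        // ("Could not find a part of the path ..."), as in the question.
        try
        {
            File.OpenRead(Path.Combine(dir, "missing", "Fren.jpg"));
        }
        catch (DirectoryNotFoundException)
        {
            Console.WriteLine("part of the path is missing");
        }
        // A permissions failure would instead surface as
        // UnauthorizedAccessException -- that tells you which problem it is.
    }
}
```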
--Dave
I've been trying to open an excel file in my ASP.NET MVC project (I use Syncfusion library for that).
For some reason, when I publish the webapp with Visual Studio, I get the
"Access to the path 'D:\home\site\wwwroot' is denied."
error.
The file is correctly uploaded to the web server (I can download it just fine when I enter the path in the browser). I've looked into this question, but there is no WEBSITE_RUN_FROM_PACKAGE setting in my Azure web app. I went into Kudu, opened the console and ran
icacls "D:\home\site\wwwroot"
which gave this output
D:\home\site\wwwroot Everyone:(I)(OI)(CI)(M,DC)
BUILTIN\Administrators:(I)(OI)(CI)(F)
So, with my limited understanding of ACLs, it seems that Everyone has Modify permission on this folder, which leaves me at a loss as to what to do.
Code:
IApplication application = excelEngine.Excel;
application.DefaultVersion = ExcelVersion.Excel2013;
application.EnablePartialTrustCode = true;
IWorkbook workbook = application.Workbooks.Create(1);
var dir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().CodeBase);
dir = dir.Replace("file:\\", "");
dir = dir.Replace("\\bin", "");
var dirXlsx = "\\Areas\\Stampe\\Content\\Excel\\RapportoProvaMare(version1).xlsx";
FileStream s = File.OpenRead(dir + dirXlsx);
IWorkbook source = excelEngine.Excel.Workbooks.Open(s);
I don't think the Syncfusion code has anything to do with the issue, I'm just reporting it for completeness.
Anybody have an idea as to what the issue could be?
EDIT: To be more specific, I'm publishing the code via Visual Studio, with the Excel file in a folder of the website (just like you would do for an image). All the tests I've made on the dir variable were done in the Azure environment (we have a duplicate test environment). I just need to read the Excel file from the local website folder and process it before sending it to the user.
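As an aside, the string surgery on Assembly.CodeBase above ("file:\\" and "\\bin" replacements) can be avoided. A minimal sketch using AppDomain.CurrentDomain.BaseDirectory instead (the subfolder names are the ones from my code; the point is only the path construction):

```csharp
using System;
using System.IO;

class BasePathSketch
{
    static void Main()
    {
        // BaseDirectory already points at the application root, so no
        // "file:\\" / "\\bin" replacements are needed.
        string dir = AppDomain.CurrentDomain.BaseDirectory;

        // Path.Combine handles the separators on any platform.
        string xlsx = Path.Combine(dir, "Areas", "Stampe", "Content",
                                   "Excel", "RapportoProvaMare(version1).xlsx");
        Console.WriteLine(xlsx);
    }
}
```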
Please use %HOME% for persistent file storage, or %TMP% for temporary storage.
You can create a %HOME% environment variable on your local computer for testing purposes.
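A minimal sketch of that pattern, assuming only the variable name %HOME% (which App Service sets for you; locally it falls back to the temp directory):

```csharp
using System;
using System.IO;

class HomeOrTmp
{
    static void Main()
    {
        // On Azure App Service %HOME% is set for you; locally you can
        // define it yourself for testing.
        string home = Environment.GetEnvironmentVariable("HOME");

        // Fall back to the temp directory for scratch files.
        string target = home ?? Path.GetTempPath();

        // Round-trip a small file to verify the location is writable.
        string file = Path.Combine(target, "upload-test.txt");
        File.WriteAllText(file, "hello");
        Console.WriteLine(File.ReadAllText(file));
        File.Delete(file);
    }
}
```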
I have a file stream that I want to store, on iOS, in a subfolder under one of the SpecialFolder locations.
The FileStream constructor wants the subfolder structure to exist first and when I try to create it, I get UnauthorizedAccessException.
Suppose my intended location is
Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments) + "/folder1/file1.xls";
which, in my session, resolves to
/var/mobile/Containers/Data/Application/035ECE7D-0E9F-4DF9-927B-B79FB31AEE01/Documents/folder1/file1.xls
Then I make sure the location exists
if (!Directory.Exists(filepath))
{
Directory.CreateDirectory(filepath);
}
which, according to Microsoft, should work
Instead of having the subfolder created and the file stream happily saving the file into it, the CreateDirectory method throws this:
{System.UnauthorizedAccessException: Access to the path "/var/mobile/Containers/Data/Application/035ECE7D-0E9F-4DF9-927B-B79FB31AEE01" is denied.
I thought MyDocuments was free to do stuff in? I've seen no documentation that says I have to apply for permission first. Where should I be creating folders?
If you code with Visual Studio, try running Visual Studio as administrator and then run your project again. Maybe that is the solution.
You can try creating platform-specific classes for saving files, and use DependencyService to call your methods. For the iOS app I used the Environment.SpecialFolder.Personal folder, and that worked for me.
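One more thing worth checking: in the question, filepath still ends in file1.xls when it is passed to Directory.Exists/CreateDirectory, so the code asks the OS to create a directory literally named file1.xls. A sketch of the likely fix, creating only the directory portion (a temp folder stands in for the iOS Documents folder here):

```csharp
using System;
using System.IO;

class CreateDirFix
{
    static void Main()
    {
        // Stand-in for the Documents folder; on device this would be
        // Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments).
        string docs = Path.Combine(Path.GetTempPath(), "docs-" + Guid.NewGuid());

        string filepath = Path.Combine(docs, "folder1", "file1.xls");

        // Create only the *directory* part of the path, not the file name.
        Directory.CreateDirectory(Path.GetDirectoryName(filepath));

        // Now the FileStream can happily save into it.
        using (FileStream fs = File.Create(filepath))
            fs.WriteByte(1);

        Console.WriteLine(File.Exists(filepath));
    }
}
```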
Ok I know Maximo 5.2 is horribly outdated but currently I just want to get the attachment working. So here is the situation:
I have an old server (running Windows 2000) with a folder shared on the network under the name F$. I have verified that the attachment upload functionality works fine: users can attach any file to a work order, and that file is copied into a specified folder on the F: drive of the server. But when I try to access that file from the client side -- that is, click the link within the work order (from the attachment tab in the Maximo web app) in order to view the attachment -- I get a 404 response. So in a way I am able to write to the server, but somehow I can't read or download from the client side.
UPDATE:
I found out that after you've uploaded a file to the server, it can be accessed from the link http://servername:port/doclinks/drawings/filename from any other client desktop in a browser. However, within the Maximo web app, the page's JavaScript automatically builds the link as http://servername/f$/MAXIMO/doclinks/drawings/filename -- it contains a redundant part and no port number. Is this generated link configurable through settings, or do I have to dig into the JSP?
You need to set up virtual directory mapping on weblogic
http://docs.oracle.com/cd/E11035_01/wls100/webapp/weblogic_xml.html
The files are on the server. You just need to correctly map the doclinks root on the file system to make it accessible to the web. You are getting a 404 error because the mapping is wrong.
In c:\maximo\applications\maximo\maximouiweb\webmodule\WEB-INF look for weblogic.xml
Add an entry:
<virtual-directory-mapping>
<local-path>/apps/maximo/</local-path>
<url-pattern>/doclinks/*</url-pattern>
</virtual-directory-mapping>
The entry above sits between <weblogic-web-app> and </weblogic-web-app>
The above example would mean your doclinks directory on the server is /apps/maximo/doclinks/
You would need to edit weblogic.xml, redeploy your maximo.ear along with your doclinks.
Because you are getting http://servername/f$/MAXIMO/doclinks/drawings/filename, particularly the f$/MAXIMO part, it tells me your F<PATH>\\MAXIMO\\doclinks = http://servername:port/doclinks mapping in your doclinks.properties is not correct. If you are on the Maximo host, via remote desktop or whatever, and you open Windows Explorer, what do you need to put in the address bar to access the attached documents folder? You said it's on F$, but F<PATH> in your properties file will translate into F:, not F$. So, you need to change your doclinks.properties
from: F<PATH>\\MAXIMO\\doclinks = http://servername:port/doclinks
to: \\\\servername\\F$\\MAXIMO\\doclinks = http://servername:port/doclinks
I have developed a web application using ASP.Net MVC 4, then hosted that web application on windows azure (windowsazure.com).
My website is unable to upload images/create files. Should I add some permissions? If so, how do I add them?
Please use Edit question to provide a code snippet.
Given the code you've provided, the failing part is most probably the line:
Server.MapPath("~/UploadImages/" + ...);
What you really have to do, is first check whether that folder already exists:
string targetFolder = Server.MapPath("~/UploadImages");
if (!System.IO.Directory.Exists(targetFolder))
{
    System.IO.Directory.CreateDirectory(targetFolder);
}
// then copy the file here
The "problem", if one can call it a problem at all, is that the server does not have this directory created when you try to put a file into it. You have to create the directory first before trying to copy files.
And by the way, it is not an "Azure" issue. If you take your code as-is and upload it to any host (without manually creating or copying the UploadImages folder) you will encounter the very same issue.
I have an ASP.NET MVC website that works in tandem with a Windows Service that processes file uploads. For easy maintenance of the site, I'd like the log file for the Windows Service to be accessible (to me, only) via the website, so that I can hit http://myserver/logs/myservice to view the contents of the log file. How can I do that?
At a guess, I could either have the service write its log file in a "Logs" folder at the top level of the site, or I could leave it where it is and set up a virtual directory to point to it. Which of these is better - or is there another, better way?
Wherever the file is stored, I can see that there's going to be another problem. I tried out the first option (Logs folder in my website), but when I try to access the file via HTTP I get an error:
The process cannot access the file 'foo' because it is being used by another process.
Now, I know from experience that my service keeps the file locked for writing while it's running, but that I can still open the file in Notepad to view the current contents. (I'm surprised that IIS insists on write access, if that's what's happening).
How can I get around that? Do I really have to write a handler to read the file and serve it to the browser myself? Or can I fix this with configuration or somesuch?
PS. I'm using IIS7 if that helps.
Unfortunately I'm afraid you'll have to write a handler that will open the file, and return it to the client.
I've written an IIS Manager extension that displays server log files, and what I noticed is that even a simple
System.IO.File.OpenRead("")
can still run into the same problem and return the same error. It was kind of confusing.
In the end I used
System.IO.File.Open("", FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
and I could easily open the file while the server was writing logs to it :)
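For completeness, a self-contained sketch of that share-mode combination (a temp file stands in for the log; the writer keeps its handle open the whole time, like the service does):

```csharp
using System;
using System.IO;

class SharedLogRead
{
    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(),
                                   "log-" + Guid.NewGuid() + ".txt");

        // Writer keeps the file open but allows others to read it.
        using (var writer = new FileStream(path, FileMode.Create,
                                           FileAccess.Write, FileShare.Read))
        {
            writer.Write(new byte[] { 65, 66, 67 }, 0, 3); // "ABC"
            writer.Flush();

            // The reader must declare FileShare.ReadWrite, because the
            // writer still holds write access to the file.
            using (var reader = new FileStream(path, FileMode.Open,
                                               FileAccess.Read,
                                               FileShare.ReadWrite))
            using (var sr = new StreamReader(reader))
                Console.WriteLine(sr.ReadToEnd()); // ABC
        }
        File.Delete(path);
    }
}
```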
I think the virtual directory is an "okay" solution, if you add the directory (application) with READ ONLY rights + perhaps "BROWSE directory" too (so you can see the folder contents rendered by the IIS).
(But once you do that, consider that you also allow anonymous access to that folder - unless you enable authentication - so watch out for "secret" contents of the logfiles that you might expose. Just a thought.)
Another approach, which I prefer myself, is to make an MVC/ASP.NET page that reads the folder contents in normal code, so that you can fully filter whatever data is shown in the HTML.
You can open the files as text streams in read-only mode.
If it's a problem to gain access to the logfolder, I would use the virtual directory with READ ONLY access and then program something that renders the logfiles as HTML on my screen and with my detail levels. Perhaps even add some sort of "login" first. But it all depends on your security levels and contents of logfiles.
Is this meaningful to you? If not, please explain more, as I've been through this thought process a few times already for similar situations.