We have to implement a document repository with Alfresco, and it is required that links to external documents (they could be a URL or a network URI like \\SERVER\doc.pdf) are available as regular Alfresco-stored documents, so that they can be tagged, categorized and commented on.
I'm wondering whether this is possible with Alfresco, or whether we are going to have to develop that functionality ourselves.
You should be able to achieve this with the Alfresco Bulk File System Import tool, which will let you load the files from where they currently reside into the Alfresco content store. In addition, Alfresco supports multiple content stores, which you can use to manage where that content is physically kept within Alfresco.
You can create a simple HTML document with a JavaScript redirect inside it, like this:
<html><body>
<script type="text/javascript">
window.location.replace("http://stackoverflow.com");
</script>
</body></html>
Save it as something.html and upload it to Alfresco.
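If there are many external links, you could even generate the stub files in bulk before importing them. The following is only a rough sketch in plain Java; the output directory, file naming and example URLs are illustrative assumptions, not anything Alfresco-specific:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class RedirectStubGenerator {
    public static void main(String[] args) throws IOException {
        // Illustrative list of external links; in practice these would come from your own source.
        String[] urls = { "http://example.com/docs/spec.html", "http://stackoverflow.com" };
        Path outDir = Files.createDirectories(Paths.get("redirect-stubs"));
        for (int i = 0; i < urls.length; i++) {
            // One minimal HTML redirect stub per external link, ready to be uploaded to Alfresco.
            String html = "<html><body><script type=\"text/javascript\">\n"
                        + "window.location.replace(\"" + urls[i] + "\");\n"
                        + "</script></body></html>";
            Files.write(outDir.resolve("link-" + i + ".html"), html.getBytes("UTF-8"));
        }
    }
}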
I'd like to give our business team the ability to edit certain pages and content themselves via a CMS solution in our Grails application, and the Weceem plugin seems like a good choice.
The potential showstopper I see is that it uses the local server file system for uploaded content, which is no good in a horizontally scaled cloud environment like ours (we run in AWS).
The question is: is it possible to tell Weceem to use the database to store binary/uploaded content, or (better yet) to override the content upload handlers to use Amazon S3 instead of the file system? (We already have code that uploads to S3 in our main app, so the question is just how to hook into Weceem.)
I assume that in such a situation it's possible to create your own content type (domain class) in your app that stores binary uploaded content. This class should be a subclass of the org.weceem.content.WcmContent class. In Weceem you can find a small example of storing such content; see the org.weceem.files.WcmContentFileDB class. There is also information available on how to extend the plugin with a custom content type. I hope this information is helpful.
As for uploading: in Weceem we use the CKEditor plugin for uploading additional files/resources, and org.weceem.files.WcmContentFile is used, which stores files on the file system. The files are uploaded using paths provided by the org.weceem.services.WcmContentRepositoryService.getUploadPath(...) method; this path is calculated from a configuration property in the application config (e.g. 'weceem.upload.dir'). I'm not sure you can hook in there.
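If you do end up writing a custom content type that pushes uploads to S3 yourself, the storage step itself is small. This is only a minimal sketch using the AWS SDK for Java; the class name, bucket name and key scheme are assumptions for illustration, not anything provided by Weceem:
import java.io.File;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3UploadStore {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    private final String bucket = "my-weceem-uploads"; // assumption: your bucket name

    // Pushes an uploaded file to S3 and returns the URL you would keep on the content node.
    public String store(File upload, String key) {
        s3.putObject(bucket, key, upload);
        return s3.getUrl(bucket, key).toString();
    }
}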
I have some HTML files that are downloaded after app startup. I would like to use the BrowserField to display these HTML pages.
The assistance I need is to know:
Is there an internal writable directory where I can store my HTML files and access them from the BrowserField, while at the same time keeping them inaccessible from the file explorer?
How do I read files from there using FileConnection?
What should I write in browserField.requestContent("????????")?
The HTML files need to be inaccessible because they contain certain logic which I would not like to expose to the end user.
Many Thanks in advance,
Godwin
Question has been answered on the BB forum here:
http://supportforums.blackberry.com/t5/Java-Development/local-storage-writable-that-is-not-accessible-from-file-explorer/td-p/2806751
In summary, the answer given (and accepted) was that the downloaded files needed to be stored on the local file system (SD Card, internal storage). Securing these files could be done with encryption.
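For the FileConnection and requestContent parts of the question, a minimal sketch along those lines might look like the following (BlackBerry Java APIs; the file:/// path and directory layout are assumptions, and the encryption mentioned above is omitted):
import java.io.OutputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.file.FileConnection;
import net.rim.device.api.browser.field2.BrowserField;

public class LocalPageLoader {
    // Assumed location on internal storage; the directory is assumed to exist already.
    private static final String PAGE_URL = "file:///store/home/user/myapp/page.html";

    public void savePage(byte[] html) throws Exception {
        FileConnection fc = (FileConnection) Connector.open(PAGE_URL, Connector.READ_WRITE);
        try {
            if (!fc.exists()) {
                fc.create();
            }
            OutputStream os = fc.openOutputStream();
            os.write(html);
            os.close();
        } finally {
            fc.close();
        }
    }

    public void showPage(BrowserField browserField) {
        // The same file:/// URL is what goes into requestContent(...).
        browserField.requestContent(PAGE_URL);
    }
}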
I am using JSF and PrimeFaces to develop a web application. I want to open existing files on client machines using a dialog box which prompts the user to select a path and the corresponding file. Please suggest a component which can be used.
While I doubt the feasibility of your intentions (accessing content directly on a client's machine has some security implications), a combination of <p:media/> and <p:lightBox/> will work for you. There are file type restrictions imposed by PrimeFaces, though (multimedia files and PDF only). The <p:media/> can be embedded in the <p:lightBox/> like so:
<p:lightBox>
<p:media value="#{yourBean.filePath}" width="100%" height="300px"/>
</p:lightBox>
Like I said, I doubt the feasibility of streaming content directly from a client's local filesystem. How do you intend to use the path c:\Users\john doe\my documents\my books\book.pdf on a user's local system within your own web application without first uploading the file to your own web server? With image files, you might have some success loading the file into memory and streaming it directly from RAM using <p:dynaImage/>; consider the scalability of this option for a high-traffic application, too.
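If you go the in-memory route, the backing bean would expose a StreamedContent built from the bytes you loaded. This is only a rough sketch: the bean and property names are made up, and older PrimeFaces versions use this DefaultStreamedContent constructor while newer ones use a builder.
import java.io.ByteArrayInputStream;
import javax.faces.bean.ManagedBean;
import org.primefaces.model.DefaultStreamedContent;
import org.primefaces.model.StreamedContent;

@ManagedBean
public class ImageBean {
    // Assumption: filled with the bytes of the image that was loaded/uploaded.
    private byte[] imageBytes;

    // Referenced from the page, e.g. value="#{imageBean.image}" on the dynamic image component.
    public StreamedContent getImage() {
        return new DefaultStreamedContent(new ByteArrayInputStream(imageBytes), "image/png");
    }
}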
I've got a working system to upload to the default root using resumable uploads, and I've been able to create the metadata in a collection, but I need to be able to do this type of upload straight to a collection or sub-collection.
My main problem is the URL; there's no documented example of an upload path straight to a collection.
I assumed that it would be like:
https://docs.google.com/feeds/upload/create-session/default/private/full/folder%3A[folder_id]/contents
considering that the typical upload path is similar except without the upload/create-session part?
In a folder entry, look for a link like this:
<link rel="http://schemas.google.com/g/2005#resumable-create-media"
type="application/atom+xml"
href="https://docs.google.com/feeds/upload/create-session/default/private/full"/>
The href is what you need to send to in order to create the file in that folder.
P.S. Have you considered using the new Drive API? It has this functionality and is considerably better to use.
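For comparison, with the Drive API (v2 Java client) uploading straight into a folder is just a matter of setting the parent on the file metadata. A rough sketch, assuming an already-authorized Drive client and a known folder id, with the MIME type hard-coded purely for illustration:
import java.util.Arrays;
import com.google.api.client.http.FileContent;
import com.google.api.services.drive.Drive;
import com.google.api.services.drive.model.File;
import com.google.api.services.drive.model.ParentReference;

public class DriveFolderUpload {
    // Uploads a local file directly into the folder identified by folderId.
    public static File uploadToFolder(Drive drive, String folderId, java.io.File localFile) throws Exception {
        File metadata = new File();
        metadata.setTitle(localFile.getName());
        metadata.setParents(Arrays.asList(new ParentReference().setId(folderId)));
        FileContent content = new FileContent("application/pdf", localFile); // assumption: MIME type
        return drive.files().insert(metadata, content).execute();
    }
}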
Turns out I was originally correct and the URL I provided was right for uploading, but the function I created for generating the URL was missing the / between feeds and upload.
I have a website that shows galleries. Users can upload their own content from the web (by entering a URL) or by uploading a picture from their computer.
I am storing the URL in the database, which works fine for the first use case, but I need to figure out where to store the actual images if a user does an upload from their computer.
Is there any recommendation here or best practice on where I should store these?
Should I save them in the App_Data or Content folders? Should they not be stored with the website at all because it's user content?
You should NOT store the user uploads anywhere they can be directly accessed by a known URL within your site structure. This is a security risk, as users could upload .htm and .js files. Even a file with the correct extension can contain malicious code that can be executed in the context of your site by an authenticated user, allowing server-side or client-side attacks.
See for example http://www.acunetix.com/websitesecurity/upload-forms-threat.htm and the question "What security issues appear when users can upload their own files?", which mention some of the issues you need to be aware of before you allow users to upload files and then present them for download within your site.
Don't put the files within your normal web site directory structure
Don't use the original file name the user gave you. You can add a Content-Disposition header with the original file name so they can download it again under the same name, but the path and file name on the server shouldn't be something the user can influence.
Don't trust image files: resize them and offer only the resized version for subsequent download (see the sketch after this list).
Don't trust MIME types or file extensions; open the file and manipulate it to make sure it's what it claims to be.
Limit the upload size and time.
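To make the re-encode point concrete, here is a minimal sketch of the idea, written in Java with ImageIO purely for illustration (the same approach applies in ASP.NET with an imaging library on that side; an actual downscaling step is omitted here). Re-encoding rejects anything that isn't a decodable image, and the server-generated name removes any user influence over the path:
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.InputStream;
import java.util.UUID;
import javax.imageio.ImageIO;

public class UploadSanitizer {
    // Decodes the upload, re-encodes it under a server-generated name and returns the stored file.
    // ImageIO.read returns null when the stream is not a decodable image, which rejects
    // files that merely claim an image extension or MIME type.
    public static File sanitize(InputStream upload, File storageDir) throws Exception {
        BufferedImage img = ImageIO.read(upload);
        if (img == null) {
            throw new IllegalArgumentException("Not a valid image");
        }
        File safe = new File(storageDir, UUID.randomUUID().toString() + ".png");
        ImageIO.write(img, "png", safe);
        return safe;
    }
}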
Depending on the resources you have to implement something like this, it can be extremely beneficial to store all this content in Amazon S3.
Once you receive the upload, you simply push it over to Amazon and put the URL in your database, as you're doing with the other images. As mentioned above, it would probably be wise to open the image and resize it before sending it over. This both checks that it is actually an image and makes sure you don't accidentally serve a full camera-resolution image to an end user.
Doing this now will make it much, much easier if you ever have to migrate/failover your site and don't want to sync gigabytes of image assets.
One way is to store the image in a database table with a varbinary field.
Another way would be to store the image in the App_Data folder, and create a subfolder for each user (~/App_Data/[userid]/myImage.png).
For both approaches you'd need to create a separate action method that makes it possible to access the images.
When accepting image uploads you need to verify the content of the file before storing it; checking the file extension alone is not trustworthy.
Using magic numbers to verify the file content is an easy way to do this.
See the Stack Overflow post on this topic and the list of magic numbers.
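As a minimal illustration of the magic number check (written in Java here; the signatures below are the standard PNG, JPEG and GIF ones, and you would extend the list to whatever types you accept):
import java.io.FileInputStream;
import java.io.IOException;

public class MagicNumberCheck {
    // Reads the first bytes of the file and compares them to known image signatures.
    public static boolean looksLikeImage(String path) throws IOException {
        byte[] header = new byte[4];
        FileInputStream in = new FileInputStream(path);
        try {
            if (in.read(header) < 4) {
                return false;
            }
        } finally {
            in.close();
        }
        boolean png  = (header[0] & 0xFF) == 0x89 && header[1] == 0x50 && header[2] == 0x4E && header[3] == 0x47;
        boolean jpeg = (header[0] & 0xFF) == 0xFF && (header[1] & 0xFF) == 0xD8 && (header[2] & 0xFF) == 0xFF;
        boolean gif  = header[0] == 0x47 && header[1] == 0x49 && header[2] == 0x46;
        return png || jpeg || gif;
    }
}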
One way of saving the file is converting it to binary and saving it in your database; the other method is using the App_Data folder.
Which storage option to use depends on your requirements. See this post as well.
Set an upload limit by adding the maxRequestLength attribute to Web.config like this, where the size of the file is specified in KB:
<httpRuntime maxRequestLength="51200" executionTimeout="3600" />
You can save your uploaded data alongside (not inside) the htdocs/www folder so that users cannot access it directly by URL. If you are using Apache, you can also add .htaccess authentication for it (keep your .htpasswd file alongside the htdocs/www folder as well, outside the web root).
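For the .htaccess part, a minimal example would be along these lines (the AuthUserFile path is an assumption; the point is that the .htpasswd file sits next to, not inside, the web root):
AuthType Basic
AuthName "Protected uploads"
AuthUserFile /var/www/.htpasswd
Require valid-user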