I am writing a macro, but at one point I need help.
What I want is to download pages in bulk for offline viewing.
I tried IDM, but the pages seem to be incomplete when viewing the files after downloading.
I also need to set the file names automatically so that another macro can rename them later, which is why I'm writing a macro instead of using IDM or a similar program.
So I want to download with Ctrl+S and Save as type: Webpage, Single File.
Everything else works; I just need the file name to increment by 1 each time.
Sample:
var shell = new ActiveXObject( "WScript.Shell" ); // sends keystrokes to the Save dialog
document.selection.StartOfDocument();
nLines = document.GetLines();
for( y = 1; y < nLines; ++y ) {
    str = document.GetLine( y );
    if( str.length != 0 ) {
        document.selection.OpenLink();
        Sleep( 5000 );
        shell.SendKeys( "^s" );
        Sleep( 500 );
        //////////////////////////////////////////////////////////////////
        // How can I add the command I want at this point?
        // I can get it to type the number '1', but it has to increment by 1 each time.
        // How can I write this code?
        //////////////////////////////////////////////////////////////////
        shell.SendKeys( "~" );
        Sleep( 500 );
        shell.SendKeys( "%{F4}" );
        Sleep( 500 );
    }
    document.selection.LineDown( false, 1 );
}
alert( "Finished Saving" );
At that point, the macro should type 1 as the file name, and increment it by 1 each time:
URL - (The filename will be saved as 1.)
URL - (The filename will be saved as 2.)
URL - (The filename will be saved as 3.)
...
URL - (The filename will be saved as 365.)
P.S. To put it another way: when saving the links, the file name should be the line number the link sits on.
That way, after all the files have been downloaded, I'll be able to use my other macro to rename them with the URL they belong to.
Finally, if you know of another way to do what I want, I'd love to hear it.
Thank you in advance and have a nice day.
y is the line number. So, replace the comment lines in your sample with:
shell.SendKeys( y );
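In context, the middle of the loop becomes something like this (a minimal sketch; converting y to a string explicitly is the safe choice, since SendKeys expects a string):

        shell.SendKeys( "^s" );
        Sleep( 500 );
        shell.SendKeys( y.toString() ); // type the current line number as the file name
        Sleep( 500 );
        shell.SendKeys( "~" );          // Enter confirms the Save dialog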
I am not a coder, but I am trying to turn ThunderSTORM's batch process into an automated one where I have a single input folder and a single output folder.
input_directory = newArray("C:\\Users\\me\\Desktop\\Images");
output_directory = "C:\\Users\\me\\Desktop\\Results";
for (i = 0; i < input_directory.length; i++) {
    open(input_directory[i]);
    originalName = getTitle();
    originalNameWithoutExt = replace( originalName, ".tif", "" );
    fileName = originalNameWithoutExt;
    run("Run analysis", "filter=[Wavelet filter (B-Spline)] scale=2.0 order=3 detector "+
        "detector=[Local maximum] connectivity=8-neighbourhood threshold=std(Wave.F1) "+
        "estimator=[PSF: Integrated Gaussian] sigma=1.6 method=[Weighted Least squares] fitradius=3 mfaenabled=false "+
        "renderer=[Averaged shifted histograms] magnification=5.0 colorizez=true shifts=2 "+
        "repaint=50 threed=false");
    saveAs(fileName+"_Results", output_directory);
}
This probably looks like a huge mess, but the original batch file used arrays and I can't figure out what that is. Taking it out breaks it, so I left it in. The main issues I have revolve around the saveAs part not working.
Using run("Export Results") works but I need to manually pick a location and file name. I tried to set this up to take the file name and rename it to the generic image name so it can save a CSV using that name.
Any help pointing out why I'm a moron? I would also love to only open one file at a time (this opens them all) and close it when the analysis is complete. But I will settle for that happening on a different day if I can just manage to save the damn CSV automatically.
For the most part, I broke the code a whole bunch of times but it's in a working condition like this.
I appreciate any and all help. Thank you!
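For reference, one thing that stands out: in the ImageJ macro language, saveAs() takes the format first and the path second, so the arguments above are reversed. Below is a hedged sketch of a loop that opens one file at a time, runs the analysis, exports a CSV named after the image, and closes the image before moving on. getFileList(), open(), getTitle(), replace() and close() are standard macro functions; the parameter string for ThunderSTORM's "Export results" command is an assumption here and is best confirmed via Plugins > Macros > Record:

input_directory = "C:\\Users\\me\\Desktop\\Images\\";
output_directory = "C:\\Users\\me\\Desktop\\Results\\";
fileList = getFileList(input_directory);            // every file in the input folder
for (i = 0; i < fileList.length; i++) {
    open(input_directory + fileList[i]);            // open a single image
    fileName = replace(getTitle(), ".tif", "");
    run("Run analysis", "filter=[Wavelet filter (B-Spline)] scale=2.0 order=3 detector "+
        "detector=[Local maximum] connectivity=8-neighbourhood threshold=std(Wave.F1) "+
        "estimator=[PSF: Integrated Gaussian] sigma=1.6 method=[Weighted Least squares] fitradius=3 mfaenabled=false "+
        "renderer=[Averaged shifted histograms] magnification=5.0 colorizez=true shifts=2 "+
        "repaint=50 threed=false");
    // The parameter names below are an assumption -- record the export once to get the real string:
    run("Export results", "filepath=[" + output_directory + fileName + "_Results.csv] "+
        "fileformat=[CSV (comma separated)] saveprotocol=false");
    close();                                        // close the image before the next iteration
}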
if (Total_sell_pos() == 0 && Total_buy_pos() == 0) {
    double previous_balance = AccountBalance(); // usd1000
}
if (AccountEquity() > previous_balance + (previous_balance * 0.05)) { // usd1000 + 50 = usd1050
    CloseSellOrders();
    CloseBuyOrders();
    Delete_Pendings();
}
The intent: if equity is more than USD 1050, then delete the pending orders and close the open ones.
But when I run the code, why does it delete the pendings and close the orders immediately, even when equity is less than the previous balance?
The following expression is the problem. If I replace
AccountEquity() > previous_balance + (previous_balance *0.05)
with
AccountEquity() > 1050
then it works. I did check the value:
double check_value = previous_balance + (previous_balance *0.05);
Print( check_value ); // 1050
May I know why I cannot use the following code?
AccountEquity() > previous_balance + (previous_balance *0.05)
Q: How to store AccountBalance() into a variable?
Let's start with the variable - declare it:
double aPreviousBALANCE;
The scope of a declaration is bounded by the enclosing code block. MQL4/5 lets you declare a variable at the "global" scope, where it is visible from inside other code blocks; but if a block declares a variable with an identical name (explicitly in the code, or introduced by a function parameter in the call signature), the local one will "shadow" access to the one declared at the "global" scope. That is what happens in the question's code: previous_balance is declared inside the first if block, so it ceases to exist at that block's closing brace, and the second if tests a different variable (most likely a global still holding 0, which makes the condition always true). Check the original code for this; the MQL4/5 IDE may warn you about such collisions during compilation (ref. Compiler Warning Messages).
Let's store the actual state in it; there are a couple of steps here:
RefreshRates(); // Force a state-update
aPreviousBALANCE = AccountInfoDouble( ACCOUNT_BALANCE ); // Store an updated value
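Applied to the code from the question, a minimal sketch (assuming the question's helper functions) with the variable declared once at the "global" scope and only assigned inside the block, so the value survives past the closing brace:

double previous_balance = 0;                        // "global" scope: survives across ticks

void OnTick()
{
    if ( Total_sell_pos() == 0 && Total_buy_pos() == 0 )
        previous_balance = AccountBalance();        // assign, do not re-declare

    if ( AccountEquity() > ( previous_balance * 1.05 ) )
    {
        CloseSellOrders();
        CloseBuyOrders();
        Delete_Pendings();
    }
}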
Q: May I know why I cannot use the following code?
Well, every language, MQL4/5 being no exception, has some order of evaluation for mathematical operators. MQL4 need not, and does not, guarantee the same ordering as any other language we may have prior experience with. So always be explicit and specify all ordering via explicit parentheses; this will save you from "surprises" should the language parser/compiler ever change the priority of operators, and from the sudden nightmares that follow. Not worth a single such shock:
if ( ( ( a * b ) + c ) < fun() ) // is EXPLICIT and a way safer, than
if ( a * b + c < fun() ) // is DEPENDENT on not having {now|in future}
// a binary boolean (<)-operator
// a higher priority than (+)-op
So rather be always explicit, and you remain on the safer side.
Finally, test:
RefreshRates(); // Force a state-update
if ( ( aPreviousBALANCE * 1.05 ) < AccountInfoDouble( ACCOUNT_EQUITY ) )
{
...
}
Also check how your settings are pre-set on the broker side; brokers run a support line you can ask about their settings:
Equity calculation depends on trading server settings.
Print( "Profit calculation mode for SYMBOL[ ",
Symbol(),
" ] is ",
MarketInfo( Symbol(), MODE_PROFITCALCMODE ),
" { 0: mode-FOREX, 1: mode-CFD, 2: mode-FUTURES }."
);
And where is my AccountBalance() function?
Recent Terminal Builds use a set of new types of calls to:
AccountInfo{Integer|
Double|
String}( <anEnumDrivenItemIDENTIFIER>
)
SymbolInfo{Integer|
Double|
String}( <aSymbolNAME>,
<anEnumDrivenItemIDENTIFIER>
)
to name just a few, so re-read the documentation to pick up the most recent changes. Always, and as soon as possible once your Terminal gets a new Build (you may notice it loading a new version of the Help files for the MQL4-IDE and/or Terminal).
Well, this happens. MQL4 evolves, and some features we were used to for ages cease to exist, suddenly start to yield inaccurate or indefinite results, or change their behaviour (ol' MQL4-ers still remember the day when the string data type silently ceased to be a string and suddenly became a struct; OK, it was mentioned somewhere deep inside an almost unrelated page of an updated Help file, yet the code crashes were painful and slow to debug, analyze and re-factor).
I have a solution with two projects in it: one called "admin", the other "work" (an Umbraco instance).
- Work has an images folder which contains the images for the site: banners, thumbnails, etc.
- Admin allows an admin user to add new stories, with images, using TinyMCE and the fileman plug-in.
So in IIS I created a virtual folder in Admin which points to the images folder in Work. However, when I try to browse the folder in fileman, it repeats lots of subdirectories and doesn't display any images. And I cannot upload any images either; it just gives me an error.
The FILES_ROOT entry within the conf.json file is as follows.
"FILES_ROOT": ".//images//",
So how do I get this virtual folder to work with fileman?
I stumbled upon this problem as well - when the FILES_ROOT points to a virtual folder in IIS, the Fileman component chokes when trying to retrieve the list of files, and when uploading, and maybe some other places. The requests don't pass the folder location properly when it references a virtual directory. If you put a network sniffer on it, you'll see the requests sent to fileman/asp_net/main.ashx and the responses that come back with the error "The given path's format is not supported."
I've reported the bug to the author via the website, but I have also discovered that if you replace the virtual directory with a SYMLINK, everything seems to work.
If you have IIS access, you probably have command line access to create the symlink, which can be done as:
mklink /j "{virtual location within your website}" "{physical location}"
Both locations should be full paths, including drive letters and the "virtual" location should be in your website root.
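For example, with hypothetical paths (yours will differ):

mklink /j "C:\inetpub\wwwroot\admin\images" "C:\inetpub\wwwroot\work\images"

Note that /j actually creates a directory junction rather than a true symbolic link; for a local folder it serves the same purpose here, and unlike a symbolic link it does not require elevated privileges to create.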
So far, I have not seen any problems referencing files this way instead of with a virtual directory, other than that my site backups started including the files in the symlink, since the OS now sees it as a physical folder within the site.
I hope this helps!!
Probably not good for every situation (maybe not even for the user that asked the question), but figured I would share in case it might help someone trying to get virtual directories mapped to network shares working. I needed to modify the ListDirTree function in file fileman/asp_net/main.ashx
protected void ListDirTree(string type)
{
    string filesRoot = GetFilesRoot();
    DirectoryInfo d = new DirectoryInfo( filesRoot );
    if ( !d.Exists )
        throw new Exception( "Invalid files root directory. Check your configuration: " + filesRoot );
    ArrayList dirs = ListDirs( d.FullName );
    dirs.Insert( 0, d.FullName );
    string localPath = _context.Server.MapPath( "~/" );
    bool isLocal = filesRoot.Contains( ":" );
    _r.Write( "[" );
    for ( int i = 0; i < dirs.Count; i++ )
    {
        string dir = (string)dirs[i];
        string lPath;
        // If it is a local path, leave it as it was
        if ( isLocal )
            lPath = dir.Replace( localPath, "" ).Replace( "\\", "/" );
        else
            // Otherwise it is probably a virtual directory, so put the original FILES_ROOT location back
            lPath = dir.Replace( filesRoot, GetSetting( "FILES_ROOT" ) ).Replace( "\\", "/" );
        _r.Write( "{\"p\":\"/" + lPath + "\",\"f\":\"" + GetFiles( dir, type ).Count.ToString() + "\",\"d\":\"" + Directory.GetDirectories( dir ).Length.ToString() + "\"}" );
        if ( i < dirs.Count - 1 )
            _r.Write( "," );
    }
    _r.Write( "]" );
}
In contenteditable regions, if you paste an element with a URL attribute, in some browsers it converts the URL from relative to absolute.
I've read through some bug reports that claim it's "fixed" in the latest release, but it's not.
I threw together this fiddle to demonstrate: Hurray for Demos!
It's there, it's ugly, and I'm wondering what is the best way to fix it.
The first idea that comes to mind is onpaste: find all anchors in the current node and fix them up with regex (a rough sketch follows this list). Not ideal I suppose, but it might be effective.
???
???
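A rough sketch of that first idea, using DOM calls instead of regex (less fragile against markup variations). The name "editor" is hypothetical: it stands for the contenteditable element:

// Hypothetical: "editor" is the contenteditable element.
editor.addEventListener('paste', function () {
  // Let the browser finish mutating the pasted markup first.
  setTimeout(function () {
    var anchors = editor.querySelectorAll('a[href]');
    for (var i = 0; i < anchors.length; i++) {
      var href = anchors[i].getAttribute('href');
      // Strip this page's own origin to restore a relative URL.
      if (href.indexOf(location.origin) === 0)
        anchors[i].setAttribute('href', href.slice(location.origin.length));
    }
  }, 0);
});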
I really wish they'd just leave things alone and not create so many browser related issues with contenteditable, but I guess that would make it too easy.
Any thoughts on the best way to address this?
CKEditor, before letting the browser break the data, copies all src, name and href attributes to data-cke-saved-src|href attributes. Unfortunately, since the data is a string at that point, it has to be done by regexp. You can find the code here: /core/htmldataprocessor.js#L772-L783.
var protectElementRegex = /<(a|area|img|input|source)\b([^>]*)>/gi,
    // Be greedy while looking for protected attributes. This will let us avoid an unfortunate
    // situation when "nested attributes", which may appear valid, are also protected.
    // I.e. if we consider the following HTML:
    //
    //     <img data-x="<a href="X"" />
    //
    // then the "non-greedy match" returns:
    //
    //     'href' => '"X"' // It's wrong! Href is not an attribute of <img>.
    //
    // while the greedy match returns:
    //
    //     'data-x' => '<a href="X"'
    //
    // which can be easily filtered out (#11508).
    protectAttributeRegex = /([\w-]+)\s*=\s*(?:(?:"[^"]*")|(?:'[^']*')|(?:[^ "'>]+))/gi,
    protectAttributeNameRegex = /^(href|src|name)$/i;

function protectAttributes( html ) {
    return html.replace( protectElementRegex, function( element, tag, attributes ) {
        return '<' + tag + attributes.replace( protectAttributeRegex, function( fullAttr, attrName ) {
            // Avoid corrupting the inline event attributes (#7243).
            // We should not rewrite the existing protected attributes, e.g. clipboard content from the editor (#5218).
            if ( protectAttributeNameRegex.test( attrName ) && attributes.indexOf( 'data-cke-saved-' + attrName ) == -1 )
                return ' data-cke-saved-' + fullAttr + ' data-cke-' + CKEDITOR.rnd + '-' + fullAttr;
            return fullAttr;
        } ) + '>';
    } );
}
Then, while processing the HTML taken from the editable element, the data-cke-saved-* attributes override the original ones.
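For example (illustrative markup only): after a paste, the browser may have rewritten an anchor to

<a href="https://example.com/page" data-cke-saved-href="/page">link</a>

and on output CKEditor restores the saved value:

<a href="/page">link</a>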
This looks like a browser bug that's not specific to contenteditable: https://bugzilla.mozilla.org/show_bug.cgi?id=805359
That issue was opened 10 years ago and last updated 6 years ago. Yet it's still open.
You can see the bug here on StackOverflow. Inspecting any SO link shows that the href value is a relative URL. Copying it and pasting it as HTML has the relative link rewritten into an absolute URL.
Example:
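(Illustrative, with a made-up question ID.) The markup in the page is

<a href="/questions/12345/some-question">some question</a>

but after copying it and pasting it as HTML into a contenteditable region, it becomes

<a href="https://stackoverflow.com/questions/12345/some-question">some question</a>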
This simple method for caching dynamic content uses register_shutdown_function() to push the output buffer to a file on disk after exiting the script. However, I'm using PHP-FPM, with which this doesn't work; a 5-second sleep added to the function indeed causes a 5-second delay in executing the script from the browser. A commenter in the PHP docs notes that there's a special function for PHP-FPM users, namely fastcgi_finish_request(). There's not much documentation for this particular function, however.
The point of fastcgi_finish_request() seems to be to flush all data and proceed with other tasks, but what I want to achieve, as would normally work with register_shutdown_function(), is basically to put the contents of the output buffer into a file without the user having to wait for this to finish.
Is there any way to achieve this under PHP-FPM, with fastcgi_finish_request() or another function?
$timeout = 3600; // cache time-out
$file = '/home/example.com/public_html/cache/' . md5($_SERVER['REQUEST_URI']); // unique id for this page

if (file_exists($file) && (filemtime($file) + $timeout) > time()) {
    readfile($file);
    exit();
} else {
    ob_start();
    register_shutdown_function(function () use ($file) {
        // sleep(5);
        $content = ob_get_flush();
        file_put_contents($file, $content);
    });
}
Yes, it's possible to use fastcgi_finish_request for that. You can save this file and see that it works:
<?php
$timeout = 3600; // cache time-out
$file = '/home/galymzhan/www/ps/' . md5($_SERVER['REQUEST_URI']); // unique id for this page

if (file_exists($file) && (filemtime($file) + $timeout) > time()) {
    echo "Got this from cache<br>";
    readfile($file);
    exit();
} else {
    ob_start();
    echo "Content to be cached<br>";
    $content = ob_get_flush();
    fastcgi_finish_request();
    // sleep(5);
    file_put_contents($file, $content);
}
Even if you uncomment the line with sleep(5), you'll see that the page still opens instantly, because fastcgi_finish_request sends the data back to the browser and then proceeds with whatever code is written after it.
First of all,
If you cache dynamic content this way, you're doing it wrong. Sure, it can be used this way and it will work, but the approach itself cripples it.
If you want to cache and handle content efficiently, create one class and wrap all the caching functions in it.
Yes, you can use fastcgi_finish_request() like register_shutdown_function(). The only difference is that fastcgi_finish_request() sends the output (if any) and DOES NOT terminate the script, while a register_shutdown_function() callback is invoked on script termination.
This is an old question, but I don't think any of the answers correctly answer the problem.
As I understand it, the problem is that none of the output gets sent to the client until the ob_get_flush() call that happens during the shutdown function.
To fix that issue, you need to pass a function and a chunk size to ob_start() in order to handle the output in chunks.
Something like this for your else clause:
$content = '';
$write_buffer_func = function($buffer, $phase) use ($file, &$content) {
    $content .= $buffer;
    if ($phase & PHP_OUTPUT_HANDLER_FINAL) {
        file_put_contents($file, $content);
    }
    return $buffer;
};
ob_start($write_buffer_func, 1024);
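Because the handler returns $buffer unchanged, each chunk is passed on to the client as soon as the 1024-byte buffer fills, rather than being held back until shutdown; the full copy accumulated in $content is written to the cache file exactly once, in the PHP_OUTPUT_HANDLER_FINAL phase.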