For quick debugging I prefer Zend\Debug\Debug::dump(). The problem: sometimes I forget to remove old dump statements and they make it into production.
It would be great if Debug::dump() only printed something when I'm in development mode.
Is there any elegant way to achieve this without having to transform Zend\Debug\Debug into a service? I like the plain and simple static method call. Probably through setting an env var when enabling dev mode?
In your /public/.htaccess, add this line:
SetEnv APPLICATION_ENV "development"
Then in your /public/index.php file:
/**
 * Set global ENV. Used for debugging.
 */
if (isset($_SERVER['APPLICATION_ENV']) && $_SERVER['APPLICATION_ENV'] === 'development') {
    define('APP_ENV', 'development');
} else {
    define('APP_ENV', 'production');
}

/**
 * Set default php.ini settings.
 *
 * The lines below include security and error-handling fixes.
 */

/**
 * Error reporting level
 */
error_reporting(APP_ENV === 'development' ? E_ALL : 0);

/**
 * Log errors to a file
 */
ini_set('log_errors', APP_ENV === 'development');

/**
 * Display all other errors
 */
ini_set('display_errors', APP_ENV === 'development');

/**
 * Display all startup errors
 */
ini_set('display_startup_errors', APP_ENV === 'development');

/**
 * Catch the last error message emitted by PHP
 */
ini_set('track_errors', APP_ENV === 'development');

/**
 * Avoid serving non-.php files as .php files
 */
ini_set('cgi.fix_pathinfo', 0);

/**
 * Helps mitigate XSS
 */
ini_set('session.cookie_httponly', 1);

/**
 * Prevents session fixation
 */
ini_set('session.use_only_cookies', 1);

/**
 * Fixes file and server encoding
 */
mb_internal_encoding('UTF-8');

/**
 * Some server configurations are missing a date timezone
 */
if (ini_get('date.timezone') == '') {
    date_default_timezone_set('UTC');
}
When your site goes public, just change the .htaccess environment variable to production and all debugging options will disappear. That's how I disable such debug options and modules, and it works fine.
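To tie this back to the original question, you can then wrap Debug::dump() in a tiny helper that checks the APP_ENV constant defined above. This is only a minimal sketch under that assumption; the dump_dev() name is made up and you can put the helper wherever it suits your project:

use Zend\Debug\Debug;

/**
 * Dump a variable only when the application runs in development mode.
 * Relies on the APP_ENV constant defined in /public/index.php.
 */
function dump_dev($var, $label = null)
{
    if (defined('APP_ENV') && APP_ENV === 'development') {
        Debug::dump($var, $label);
    }
}

// Usage: safe to leave in place, it prints nothing in production.
dump_dev($someValue, 'someValue');

Any dump statements that slip through to production then simply produce no output.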
I have a shared library loaded with the @Library('libName') annotation in my Jenkinsfiles. How can I find out (in the pipeline code) which version has been loaded? How can I distinguish whether the library has been loaded using:
@Library('libName'), @Library('libName#master') or @Library('libName#superBranch')?
Regards, Dawid.
The following works for me on Jenkins 2.318 and returns the branch name, at least inside the library:
env."library.LIBNAME.version"
Where LIBNAME is the name of your library, so in your example:
echo "library version: ${env."library.libName.version"}"
This would print e.g. master or superBranch.
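For example, a pipeline can branch on that value. This is only a sketch; it assumes the library alias is libName and that master is its default branch:

@Library('libName#superBranch') _

// Hypothetical guard: warn when the shared library is not on the default branch.
def libVersion = env."library.libName.version"
if (libVersion != 'master') {
    echo "WARNING: shared library loaded from ${libVersion} instead of master"
}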
You can do something similar to the snippet below.
@Library('MyLibrary#test') _
node('master') {
    dir("${WORKSPACE}@libs/MyLibrary") {
        // This is the library checkout path.
        // Run any git command here to get the branch name.
    }
}
Important point: if you run this job concurrently, the library directory name will be something like MyLibrary@2, depending on the build number.
Hope this helps.
So this is not easy, but this is what my project does. We use git tags, but it's essentially the same concept. However, because we use a convention, we can differentiate. (Jenkins shared library checkouts treat the #'whatever' suffix as a branch first and then as a tag revision.)
This is under-the-hood stuff, so there is no guarantee it will stay the same as Jenkins development continues.
The wrapper function essentially returns true/false depending on whether the library has been locked to a version. We return true whenever the revision matches vx.x.x. You would probably return true whenever it is not the default branch (whatever you have set in Jenkins).
import hudson.model.Action
import hudson.plugins.git.Branch
import hudson.plugins.git.Revision
import hudson.plugins.git.util.BuildData
import java.util.regex.Pattern

/**
 * Wrapper for checking whether the loaded Jenkins shared libs point to a git branch or tag
 *
 * @return Boolean
 */
Boolean isLockedSharedLibraryRevision() {
    List<Action> actions = $build().getActions(BuildData.class)
    return checkSharedLibraryBranches(actions)
}

/**
 * Check if shared libraries are locked to a specific git tag (commit hash).
 * Returns true if running on a particular revision (git tag).
 * Returns false if running on the HEAD of a branch (develop by default).
 *
 * Assumption is that the git tag follows the format vx.x.x (e.g. v1.0.22)
 *
 * @param actions (list of Jenkins actions that match BuildData.class)
 * @return Boolean
 */
Boolean checkSharedLibraryBranches(List<Action> actions) {
    Boolean isLockedSharedLibraryRevision = false
    Boolean jenkinsSharedFound = false
    if (actions == null || actions.size() == 0) {
        throw new IllegalArgumentException("Build actions must be provided")
    }
    // Check each BuildData action returned for one containing the jenkins-shared revisions
    actions.each { action ->
        HashSet remoteURLs = action.getRemoteUrls()
        remoteURLs.each { url ->
            if (url.contains('<insert-your-repo-name>')) {
                jenkinsSharedFound = true
                Pattern versionRegex = ~/^v\d+\.\d+\.\d+$/
                /**
                 * When jenkins-shared is found, evaluate the revision branch/tag name.
                 * getLastBuiltRevision() returns the current execution's build. This was functionally tested.
                 * If a newer build runs and completes before the current job, the value is not changed.
                 * i.e. build 303 starts and is in progress, build 304 starts and finishes.
                 * Build 303 calls getLastBuiltRevision(), which returns job 303 (not 304).
                 */
                Revision revision = action.getLastBuiltRevision()
                /**
                 * This is always a collection of 1, even when multiple tags exist against the same sha1 in git.
                 * It is always the tag/branch you're looking at and doesn't report any extras...
                 * Despite this we loop to be safe.
                 */
                Collection<Branch> branches = revision.getBranches()
                branches.each { branch ->
                    String name = branch.getName()
                    if (name ==~ versionRegex) {
                        println "INFO: Jenkins-shared locked to version ${name}"
                        isLockedSharedLibraryRevision = true
                    }
                }
            }
        }
    }
    if (!jenkinsSharedFound) {
        throw new IllegalArgumentException("None of the related build actions have a remoteURL pointing to Jenkins Shared, aborting")
    }
    println "INFO: isLockedSharedLibraryRevision == ${isLockedSharedLibraryRevision}"
    return isLockedSharedLibraryRevision
}
In a Jenkinsfile, if I have a Jenkins shared library installed under the alias my-awesome-lib, I can include it using the syntax:
@Library('my-awesome-lib')
import ...
But how can I refer to the library from the Jenkins script console?
You can refer to the library object from the script console like this:
// get Jenkins instance
Jenkins jenkins = Jenkins.getInstance()
// get Jenkins Global Libraries
def globalLibraries = jenkins.getDescriptor("org.jenkinsci.plugins.workflow.libs.GlobalLibraries")
globalLibraries.getLibraries()
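For instance, you can list each configured library and its default version with the getters mentioned in the Javadoc further below (a quick sketch):

// List every configured global shared library and its default version.
globalLibraries.getLibraries().each { lib ->
    println "${lib.getName()} -> ${lib.getDefaultVersion()}"
}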
but actually using the shared library's code will not be simple and might even be impossible.
Continuing from the code above, let's say you do:
def lib = globalLibraries[0]
get the retriever:
def ret = lib.getRetriever()
then you need to retrieve the source code, but in order to call retrieve(), you need a few objects that you don't have in the script console:
/**
 * Obtains library sources.
 * @param name the {@link LibraryConfiguration#getName}
 * @param version the version of the library, such as from {@link LibraryConfiguration#getDefaultVersion} or an override
 * @param target a directory in which to check out sources; should create {@code src/**}{@code /*.groovy} and/or {@code vars/*.groovy}, and optionally also {@code resources/}
 * @param run a build which will use the library
 * @param listener a way to report progress
 * @throws Exception if there is any problem (use {@link AbortException} for user errors)
 */
public abstract void retrieve(@Nonnull String name, @Nonnull String version, @Nonnull FilePath target, @Nonnull Run<?,?> run, @Nonnull TaskListener listener) throws Exception;
so there might be a hacky way to do it, but IMO it isn't worth it.
I suggest you deploy the following Jenkins pipeline to some repo.
Every time you use it, it will show you the last executed code.
If you have libraries that load automatically, you have a great playground to fiddle with them.
pipeline {
    agent any
    options {
        skipDefaultCheckout true // Access to this file is not required
        timestamps()
    }
    parameters {
        // Use the current value as the default, so every time you open "Run with parameters" you get the last code you executed.
        text(name: 'SCRIPT', defaultValue: params.SCRIPT,
             description: 'Groovy script')
    }
    stages {
        stage("main") {
            steps {
                script {
                    writeFile file: 'script.groovy', text: params.SCRIPT
                    def retvalue = load 'script.groovy'
                    if (retvalue != null) {
                        // disable the next line or install this cool plugin
                        currentBuild.description = (retvalue as String).split('\n')[0].take(40)
                    }
                    echo "Return value: '${retvalue}'"
                }
            } // steps
        } // stage
    } // stages
    post {
        cleanup {
            script {
                deleteDir()
            }
        }
    }
} // pipeline
I have a website with a form to upload files. I want to automatically sign in and upload image files once changes are seen in a local folder on my computer. Can any guidance be provided on the matter?
As per my understanding, you can write a task for this purpose which runs, let's say, every hour, checks whether any changes have been made in the directory, and then uploads those files to your app.
I don't know what kind of system you are working on, but you could do something like this. If you are on a Linux system, you could use the watch command to track the activity of the directory of your choice. Then you could use something like Mechanize in a Ruby script, triggered by the watch command, that goes and submits the form and uploads the file for you by selecting the file with the latest creation date.
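A rough sketch of that idea with Mechanize is below; the URLs, form field names, and watched directory are placeholders that depend entirely on your site:

require 'mechanize'

agent = Mechanize.new

# Hypothetical login form; adjust the URL and field names for your site.
login_page = agent.get('https://example.com/login')
login_form = login_page.forms.first
login_form['username'] = 'me'
login_form['password'] = 'secret'
agent.submit(login_form)

# Pick the most recently created file in the watched directory.
newest = Dir.glob('/path/to/watched/*').max_by { |f| File.ctime(f) }

# Hypothetical upload form with a single file field.
upload_page = agent.get('https://example.com/upload')
upload_form = upload_page.forms.first
upload_form.file_uploads.first.file_name = newest
agent.submit(upload_form)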
I realize that the post says Ruby on Rails, but this answer is just as legitimate as writing a solution in Ruby (and a bit easier/faster).
This uses Qt C++; you could do something like the following
(untested, you'll have to make adjustments for your exact situation).
Overview of the code:
Create a program that loops on a timer every 20 minutes and goes through the entire directory that you specify with WATCH_DIR. If it finds any files in that directory that were modified since the loop last ran (or after the program started but before the first loop runs), it uploads each such file to whatever URL you specify with UPLOAD_URL.
Create a file called AutoUploader.pro and a file called main.cpp:
AutoUploader.pro
QT += core network
QT -= gui
CONFIG += c++11
TARGET = AutoUploader
CONFIG += console
CONFIG -= app_bundle
TEMPLATE = app
SOURCES += main.cpp
main.cpp
#include <QtCore/QCoreApplication>
#include <QtCore/qglobal.h>
#include <QByteArray>
#include <QDateTime>
#include <QDir>
#include <QDirIterator>
#include <QFile>
#include <QFileInfo>
#include <QHash>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QTimer>
#include <QUrl>

#define WATCH_DIR "/home/lenny/images"
#define UPLOAD_URL "http://127.0.0.1/upload.php"

class MainLoop : public QObject {
    Q_OBJECT
public:
    MainLoop(QString _watch_directory = qApp->applicationDirPath()) {
        watch_directory = _watch_directory;
        // the ACTION="" part of the upload form
        website_upload_url = UPLOAD_URL;
        /* 20 minutes:
           20 * 60 * 1000 = 1200000 ms */
        QTimer::singleShot(1200000, this, SLOT(check_for_updates()));
        /* this stops any file modified before you ran this program from
           being uploaded, so it won't upload all of the files at startup */
        program_start_time = QDateTime::currentDateTime();
    }

    ~MainLoop() { qApp->exit(); }

    QDateTime program_start_time;
    QString watch_directory;
    QString website_upload_url;
    QNetworkReply *reply = nullptr;

    // hash table to store the last modified time for each file
    QHash<QString, QDateTime> last_modified_time;

public slots:
    void check_for_updates() {
        QDirIterator it(watch_directory);
        /* loop through all files in the directory */
        while (it.hasNext()) {
            QFileInfo info(it.next());
            /* check whether the file's modified time is after
               program_start_time */
            if (info.lastModified().msecsTo(program_start_time) < 1) {
                upload_file(info.absoluteFilePath());
            }
        }
        /* set program_start_time to the current time to catch changes next
           time around, then start a timer to repeat this check in
           20 minutes */
        program_start_time = QDateTime::currentDateTime();
        QTimer::singleShot(1200000, this, SLOT(check_for_updates()));
    }

    /* upload file code came from
       https://forum.qt.io/topic/11086/solved-qnetworkaccessmanager-uploading-files/2 */
    void upload_file(QString filename) {
        QNetworkAccessManager *am = new QNetworkAccessManager(this);
        QString path(filename);
        // defined with UPLOAD_URL
        QNetworkRequest request(QUrl(website_upload_url));
        QString bound = "margin"; // name of the boundary
        // according to RFC 1867 we need to put this string here:
        QByteArray data(QString("--" + bound + "\r\n").toLatin1());
        data.append("Content-Disposition: form-data; name=\"action\"\r\n\r\n");
        data.append("upload.php\r\n");
        data.append(QString("--" + bound + "\r\n").toLatin1()); // according to RFC 1867
        data.append(QString("Content-Disposition: form-data; name=\"uploaded\"; filename=\"%1\"\r\n").arg(QFileInfo(filename).fileName()).toLatin1());
        data.append(QString("Content-Type: image/%1\r\n\r\n").arg(QFileInfo(filename).suffix()).toLatin1()); // data type
        QFile file(path);
        if (!file.open(QIODevice::ReadOnly))
            return;
        data.append(file.readAll()); // read the file contents
        data.append("\r\n");
        data.append(QString("--" + bound + "--\r\n").toLatin1());
        request.setRawHeader("Content-Type", QString("multipart/form-data; boundary=" + bound).toLatin1());
        request.setRawHeader("Content-Length", QByteArray::number(data.length()));
        this->reply = am->post(request, data);
        connect(this->reply, SIGNAL(finished()), this, SLOT(replyFinished()));
    }

    void replyFinished() {
        /* perform some code here whenever an upload finishes */
    }
};

int main(int argc, char *argv[])
{
    QCoreApplication a(argc, argv);
    MainLoop loop(WATCH_DIR);
    return a.exec();
}

#include "main.moc"
Before running this program, make sure to read through it completely and make the necessary changes by reading the comments and the post; you may also have to install the Qt framework, depending on your platform.
Anyway, the final step is to run qmake to create the project makefile and then make to build the binary.
Obviously the last steps differ depending on what system you are using.
This program will continue to run essentially forever until you close it, uploading changed files every 20 minutes.
Hope this helps...
In our Grails web applications, we'd like to use external configuration files so that we can change the configuration without releasing a new version. We'd also like these files to be outside of the application directory so that they stay unchanged during continuous integration.
The last thing we need to do is to make sure the external configuration files exist. If they don't, then we'd like to create them, fill them with predefined content (production environment defaults) and then use them as if they existed before. This allows any administrator to change settings of the application without detailed knowledge of the options actually available.
For this purpose, there's a couple of files within web-app/WEB-INF/conf ready to be copied to the external configuration location upon the first run of the application.
So far so good. But we need to do this before the application is initialized so that production-related modifications to data sources definitions are taken into account.
I can do the copy-and-load operation inside the Config.groovy file, but I don't know the absolute location of the WEB-INF/conf directory at the moment.
How can I get the location during this early phase of initialization? Is there any other solution to the problem?
There is a best practice for this.
In general, never write to the folder where the application is deployed. You have no control over it. The next rollout will remove everything you wrote there.
Instead, leverage the built-in configuration capabilities that the real pros use (Spring and/or JPA).
JNDI is the norm for looking up resources like databases, files and URLs.
Operations will have to configure JNDI, but they appreciate the attention.
They also need an initial set of configuration files, and they should be prepared to make changes at times, as required by the development team.
As always, all configuration files should be in your source code repo.
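As a rough illustration of what that looks like in Grails (the JNDI name below is hypothetical and depends on how operations configure the container), DataSource.groovy can point at a container-managed resource:

// grails-app/conf/DataSource.groovy -- sketch only; the JNDI name is made up.
environments {
    production {
        dataSource {
            // Look up the container-managed connection pool instead of hard-coding credentials.
            jndiName = "java:comp/env/jdbc/myAppDataSource"
        }
    }
}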
I finally managed to solve this myself by using Java's ability to locate resources placed on the classpath.
I took the .groovy files that are later to be copied outside, placed them into the grails-app/conf directory (which is on the classpath) and appended a suffix to their names so that they wouldn't get compiled when packaging the application. So now I have *Config.groovy files containing configuration defaults (for all environments) and *Config.groovy.production files containing defaults for the production environment (overriding the precompiled defaults).
Now - Config.groovy starts like this:
grails.config.defaults.locations = [ EmailConfig, AccessConfig, LogConfig, SecurityConfig ]
environments {
    production {
        grails.config.locations = ConfigUtils.getExternalConfigFiles(
            '.production',
            "${userHome}${File.separator}.config${File.separator}${appName}",
            'AccessConfig.groovy',
            'Config.groovy',
            'DataSource.groovy',
            'EmailConfig.groovy',
            'LogConfig.groovy',
            'SecurityConfig.groovy'
        )
    }
}
Then the ConfigUtils class:
import java.util.logging.Logger

import org.apache.commons.io.FileUtils

public class ConfigUtils {
    // Log4j may not be initialized yet
    private static final Logger LOG = Logger.getGlobal()

    public static def getExternalConfigFiles(final String defaultSuffix, final String externalConfigFilesLocation, final String... externalConfigFiles) {
        final def externalConfigFilesDir = new File(externalConfigFilesLocation)
        LOG.info "Loading configuration from ${externalConfigFilesDir}"
        if (!externalConfigFilesDir.exists()) {
            LOG.warning "${externalConfigFilesDir} not found. Creating..."
            try {
                externalConfigFilesDir.mkdirs()
            } catch (e) {
                LOG.severe "Failed to create external configuration storage. Default configuration will be used."
                e.printStackTrace()
                return []
            }
        }
        final def cl = ConfigUtils.class.getClassLoader()
        def result = []
        externalConfigFiles.each {
            final def file = new File(externalConfigFilesDir, it)
            if (file.exists()) {
                result << file.toURI().toURL()
                return
            }
            def error = false
            final def defaultFileURL = cl.getResource(it + defaultSuffix)
            def defaultFile
            if (defaultFileURL) {
                defaultFile = new File(defaultFileURL.toURI())
                error = !defaultFile.exists()
            } else {
                error = true
            }
            if (error) {
                LOG.severe "Neither ${file} nor ${defaultFile} exists. Skipping..."
                return
            }
            LOG.warning "${file} does not exist. Copying ${defaultFile} -> ${file}..."
            try {
                FileUtils.copyFile(defaultFile, file)
            } catch (e) {
                LOG.severe "Couldn't copy ${defaultFile} -> ${file}. Skipping..."
                e.printStackTrace()
                return
            }
            result << file.toURI().toURL()
        }
        return result
    }
}
I would like to print a list of all environment variables and their values. I searched Stack Overflow and the following questions come close but don't answer my question:
How to discover what is available in lua environment? (it's about Lua environment not the system environment variables)
Print all local variables accessible to the current scope in Lua (again about _G not the os environment variables)
http://www.lua.org/manual/5.1/manual.html#pdf-os.getenv (this is a good function but I have to know the name of the environment variable in order to call it)
Unlike C, Lua doesn't have an envp** parameter passed to main(), so I couldn't find a way to get a list of all environment variables. Does anybody know how I can get the names and values of all environment variables?
Standard Lua functions are based on C-standard functions, and there is no C-standard function to get all the environment variables. Therefore, there is no Lua standard function to do it either.
You will have to use a module like luaex, which provides this functionality.
This code was extracted from an old POSIX binding.
#include <stdlib.h>
#include <string.h>

#include <lua.h>
#include <lauxlib.h>

static int Pgetenv(lua_State *L)            /** getenv([name]) */
{
    if (lua_isnone(L, 1))
    {
        extern char **environ;
        char **e;
        if (*environ == NULL) lua_pushnil(L); else lua_newtable(L);
        for (e = environ; *e != NULL; e++)
        {
            char *s = *e;
            char *eq = strchr(s, '=');
            if (eq == NULL)                 /* will this ever happen? */
            {
                lua_pushstring(L, s);
                lua_pushboolean(L, 0);
            }
            else
            {
                lua_pushlstring(L, s, eq - s);
                lua_pushstring(L, eq + 1);
            }
            lua_settable(L, -3);
        }
    }
    else
        lua_pushstring(L, getenv(luaL_checkstring(L, 1)));
    return 1;
}
You can install the lua-posix module. Alternatively, RedHat installations have POSIX routines built-in, but to enable them, you have to do a trick:
cd /usr/lib64/lua/5.1/
# (replace 5.1 with your version)
ln -s ../../librpmio.so.1 posix.so
# (replace the "1" as needed)
lua -lposix
> for i, s in pairs(posix.getenv()) do print(i,s,"\n") end
The trick is in creating a soft link to RPM's librpmio library and naming the soft link with the same name as the library that Lua will attempt to open. If you don't do this, you get:
./librpmio.so: undefined symbol: luaopen_librpmio
or similar.
local osEnv = {}

for line in io.popen("set"):lines() do
    local envName = line:match("^[^=]+")
    osEnv[envName] = os.getenv(envName)
end
This would not work in some cases, e.g. when there is no valid shell for the user running your app.
An easy two-liner:
local buf = io.popen("env", "r")
local output = buf:read("*a")
print(output) -- or do whatever