My language is Lua.
Many functions in the file table.lua are named "Table_XXX", as in picture 1. In table.lua's symbol window, they are all shown as the same symbol, "Table" (again, picture 1). When such a function is called from another file (or even from the same file), Source Insight cannot recognize the function and jump to its definition.
So, what should I do to solve this problem?
Thanks a lot.
Unfortunately you can't expect any help from SI folks so it will have to be a DIY thing.
Open Lua.xclf (it's XML, you can even drag&drop it into SI itself).
Notice:
<Expression
SymbolType="Function"
Pattern="function\w+\([a-zA-Z][a-zA-Z0-9]*\)"
RegexType="Source Insight"
/>
and that the regex function\w+\([a-zA-Z][a-zA-Z0-9]*\) doesn't allow the "_" character in the identifier. If you are already playing with it, do yourself a favor and switch the type to "Perl Compatible" to get much better control.
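For example, something along these lines might do it (a sketch only; I haven't verified exactly how SI maps the capture group to the symbol name, so treat it as a starting point):
<Expression
SymbolType="Function"
Pattern="function\s+([A-Za-z_][A-Za-z0-9_]*)\s*\("
RegexType="Perl Compatible"
/>
The character class now includes "_", so names like Table_XXX should be picked up as whole symbols.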
You can also edit this in Options / Preferences / Languages: double-click the language and open Custom Parsing.
Also notice that in Keywords [function] is declared as "control" and you'd probably want "Declare Function" or "Declare Method".
You'll have to scrutinize it in great detail (either in the XML or in the UI) and you'll probably find many other problems. If you reach a good state you'll have to publish it on your own (say on https://pastebin.com/) and then post the link here.
In theory you could write to their support to ask whether they want a better file, but don't expect much. SI ended up being so neglected that you could start crying if you remember its golden days (whole Windows shipped with SI - several times). That's why people gave up. So, now it's DIY - find some highlighter files from other editors and copy the regexes if you can.
Maybe IBM could buy them - right after they pay for Red Hat :-)
I know my question is rather generic (and it looks like "please do all of my work for me"), so let me make it somewhat clearer: I'm, more or less, a COBOL beginner; the only thing I've done with it so far is a small FastCGI application for a single-serving page, just to have done something with it.
Now I'm considering writing a small file server in GnuCOBOL so that I have something real to work with. I tend to learn new languages by writing stuff in them. While I do have an idea of how to read and process a specific file, I could still use a clue about how to collect and handle a given directory's contents.
Sadly, the system calls C$LIST-DIRECTORY, x"91" function 69, CBL_DIR_SCAN_START and their siblings are still on the GnuCOBOL wish list, so I can't just adapt existing solutions from the commercial COBOLs. I'm somewhat lost here.
call "system" using "dir /b > fileslist.txt" end-call
And then read in the listing file ...
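Something like this should work as a starting point (an untested sketch; the program name, file names and record size are placeholders, and dir /b is Windows-only):
       IDENTIFICATION DIVISION.
       PROGRAM-ID. LIST-FILES.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT FILES-LIST ASSIGN TO "fileslist.txt"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  FILES-LIST.
       01  FILE-NAME-REC      PIC X(256).
       WORKING-STORAGE SECTION.
       01  EOF-FLAG           PIC X VALUE "N".
       PROCEDURE DIVISION.
       MAIN-PARA.
      * run the directory listing, then read it back line by line
           CALL "SYSTEM" USING "dir /b > fileslist.txt" END-CALL
           OPEN INPUT FILES-LIST
           PERFORM UNTIL EOF-FLAG = "Y"
               READ FILES-LIST
                   AT END MOVE "Y" TO EOF-FLAG
                   NOT AT END DISPLAY FILE-NAME-REC
               END-READ
           END-PERFORM
           CLOSE FILES-LIST
           STOP RUN.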
The company I work for is developing a new programming language that will ease the job of engineers. My job is to supply this language with a nice editor, which also involves code folding. I need custom code folding that does not rely on "{" and "}". I am working with Geany filetypes and have added a new filetype. I want to fold structures like the ones below.
if %condition% then
    %statement%
else if %condition% then
    %statement%
else
end if

for each %element% in %range% do
    %statement%
end for
I know my language is far from C-like; however, I added the following lines to my filetype definition to enable syntax coloring.
[settings]
lexer_filetype=C
Any kind of help will be appreciated.
I don't know the exact answer, but I know how to dig it up. Since there is no answer yet, I'm going to describe how one can be found. Using Scintilla and its lexers should lead to a solution to this problem. Both the Geany and Scintilla documentation mention support for this feature.
Under Debian:
cp /usr/share/geany/filetypes.c ~/.config/geany/filedefs/
chown myUser:myGroup ~/.config/geany/filedefs/filetypes.c
Edit the file. Under the section [lexer_properties] add the line:
fold.cpp.comment.explicit=1
Save the file.
Open Geany. You can now create user-defined fold points using the default //{ and //} delimiters in C and in C++ files. These do not affect your code, because to C and C++ they are just comments.
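For example, with that property set, a region like this becomes foldable in the editor (plain C, arbitrary contents):
//{ configuration block (custom fold region)
int width  = 640;
int height = 480;
//}
Since your custom filetype uses lexer_filetype=C, the same //{ and //} markers should work in your own files too, provided // starts a comment in your language.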
Is it possible to unmangle names like these in Delphi?
If so, where do I get more information?
Here is an example of an error message where it cannot find a certain entry point in dbrtl100.bpl.
I want to know which exact function it cannot find (unit, class, name, parameters, etc).
---------------------------
myApp.exe - Entry Point Not Found
---------------------------
The procedure entry point #Dbcommon#GetTableNameFromSQLEx$qqrx17System#WideString25Dbcommon#IDENTIFIEROption could not be located in the dynamic link library dbrtl100.bpl.
---------------------------
OK
---------------------------
I know it is the method GetTableNameFromSQLEx in the Dbcommon unit (I have Delphi with the RTL/VCL sources), but sometimes I bump into apps for which not all code is available (yes, clients should always buy all the source code for 3rd-party stuff, but sometimes they don't).
But say this is an example for which I do not have the code, or have only the interface files (BDE.INT, anyone?).
What parameters does it have (i.e. which potential overload)?
What return type does it have?
Is this mangling the same for any Delphi version?
--jeroen
Edit 1:
Thanks to Rob Kennedy: tdump -e dbrtl100.bpl does the trick. No need for -um at all:
C:\WINDOWS\system32>tdump -e dbrtl100.bpl | grep GetTableNameFromSQLEx
File STDIN:
00026050 1385 04AC __fastcall Dbcommon::GetTableNameFromSQLEx(const System::WideString, Dbcommon::IDENTIFIEROption)
Edit 2:
Thanks to TOndrej who found this German EDN article (English Google Translation).
That article describes the format pretty accurately, and it should be possible to create some Delphi code to unmangle this.
Pity that the website the author mentions (and the email address) are now dead, but it's good to know this info.
--jeroen
There is no function provided with Delphi that will unmangle function names, and I'm not aware of it being documented anywhere. Delphi in a Nutshell mentions that the "tdump" utility has a -um switch to make it unmangle symbols it finds. I've never tried it.
tdump -um -e dbrtl100.bpl
If that doesn't work, then it doesn't look like a very complicated scheme to unmangle yourself. Evidently, the name starts with "#" and is followed by the unit name and function name, separated by another "#" sign. That function name is followed by "$qqrx" and then the parameter types.
The parameter types are encoded using the character count of the type name followed by the same "#"-delimited format from before.
The "$" is necessary to mark the end of the function name and the start of the parameter types. The remaining mystery is the "qqrx" part. That's revealed by the article Tondrej found. The "qqr" indicates the calling convention, which in this case is register, a.k.a. fastcall. The "x" applies to the parameter and means that it's constant.
The return type doesn't need to be encoded in the mangled function name because overloading doesn't consider return types anyway.
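Putting that together, the name from the error message decomposes like this (my annotation, following the scheme above):
#Dbcommon#GetTableNameFromSQLEx$qqrx17System#WideString25Dbcommon#IDENTIFIEROption

#Dbcommon#GetTableNameFromSQLEx    unit Dbcommon, routine GetTableNameFromSQLEx
$qqr                               register (__fastcall) calling convention
x                                  the next parameter is const
17System#WideString                17 characters: System.WideString
25Dbcommon#IDENTIFIEROption        25 characters: Dbcommon.IDENTIFIEROption
which matches the tdump output shown in the question.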
Also see this article (in German).
I guess the mangling is probably backward-compatible, and new mangling schemes are introduced in later Delphi versions for new language features.
If you have C++Builder, check out $(BDS)\source\cpprtl\Source\misc\unmangle.c - it contains the source code for the unmangling mechanism used by TDUMP, the debugger and the linker. (C++Builder and Delphi use the same mangling scheme.)
From the Delphi 2007 source files:
function GetTableNameFromSQLEx(const SQL: WideString; IdOption: IDENTIFIEROption): WideString;
This seems to be the same version, since I also have the same .BPL in my Windows\System32 folder.
Source can be found in [Program Files folders]\CodeGear\RAD Studio\5.0\source\Win32\db
Borland/CodeGear/Embarcadero has used this encoding for a while now and has never given many details about the .BPL format. I've never been very interested in BPLs, since I hate using runtime libraries in my projects; I prefer to compile them into my projects, although this results in much bigger executables.
I'm searching for a pretty-print program (script, code, whatever) for Informix-4GL sources.
Do you know of any? Thank you, Peter.
Have you looked at the IIUG (International Informix User Group) software archive? There are two pretty printers there (of indeterminate quality).
The other place to look would be the Aubit4GL site - an open source variant of I4GL. Again, I'm not sure that they have a pretty-printer, but it might be something they have (though a casual check doesn't show one).
I don't know if anyone is reading this post anymore, but the easiest way to get some kind of nice "pretty print" of 4GL code is to view it in OpenEdge Developer Studio, then use Ctrl-I to set the indentation. You can adjust the indentation in the editor settings by specifying the tab length (the default is 4; I use 3).
Then press Ctrl-Shift-F to make all command words uppercase.
Next, you can condense the code by a few lines by moving each "DO:" up onto the line with its "THEN" statement, using this regular-expression search and replace.
Ctrl-F:
search "\s*\n\s*DO[:]"
replace " DO:"
Make sure you tick the checkbox marked "regular expressions".
At this point the code is nice and tidy.
Press Ctrl-A and Ctrl-C to copy it to the clipboard.
Paste it into Outlook as an email (without sending it) and print it in color.
I need a way to add text comments "Word style" to a LaTeX document. I don't mean comments in the source code of the document. What I want is a way to add corrections, suggestions, etc. to the document, so that they don't interrupt the text flow, but still make it easy for everyone to see which part of the sentence they relate to. They should also "disappear" when compiling the document for printing.
At first I thought about writing a new command that would just forward its input to \marginpar{}, and, when compiling for printing, would simply have an empty definition. The problem is that you have no guarantee where the comments will appear, and you will not be able to distinguish them from the other marginpars.
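To make the idea concrete, this is roughly what I had in mind (a rough sketch; \note and the \ifdraftnotes switch are just names I made up):
\newif\ifdraftnotes
\draftnotestrue   % change to \draftnotesfalse for the print version
\newcommand{\note}[1]{\ifdraftnotes\marginpar{\footnotesize #1}\fi}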
Any idea?
todonotes is another package that makes nice looking callouts. You can see a number of examples in the documentation.
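A minimal example of the sort of thing it gives you (the note text is obviously made up; the disable option is what you'd use for the print version):
\usepackage{todonotes}   % or \usepackage[disable]{todonotes} for printing
\todo{Reword this sentence.}
\todo[inline]{A longer remark placed in the text flow instead of the margin.}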
Since LaTeX is a text format, if you want to show someone the differences in a way that they can use them (and cherry pick from them) use the standard diff tool (e.g., diff -u orig.tex new.tex > docdiffs). This is the best way to annotate something like LaTeX documents, and can be easily used by anyone involved in the production of a document from LaTeX sources. You can then use standard LaTeX comments in your patch to explain the changes, and they can be very easily integrated. If the document lives in a version control system of some sort, just use the VCS to generate a patch file that can be reviewed.
I have used changes.sty, which gives basic change colouring:
\added{new text}
\deleted{old text}
\replaced{new text}{old text}
All of these take an optional parameter with the initials of the author who made the change. This results in a different colour per author, and the initials are displayed as a superscript after the changed text.
\replaced[MI]{new text}{old text}
You can hide the change marks by giving the option final to the changes package.
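For example:
\usepackage[final]{changes}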
This is very basic, and comments are not supported, but it might help.
My little home-rolled "fixme" tool uses \marginpar where possible and goes inline in places (like captions) where that is hard to arrange. This works out because I don't often use margin paragraphs for other things. This does mean you can't finalize the layout until everything is fixed, but I don't feel much pain from that...
Other than that I heartily agree with Michael about using standard tools and version control.
See also:
Tips for collaboratively editing a LaTeX document (which addresses your main question...)
https://stackoverflow.com/questions/193298/best-practices-in-latex
and a self-plug:
How do I get Emacs to fill sentences, but not paragraphs?
You could also try the trackchanges package.
You can use the changebar package to highlight areas of text that have been affected.
If you don't want to do the markup manually (which can be tedious and interrupts the flow of editing), the neat latexdiff utility will take a diff of your document and produce a version of it with markup added, visually displaying the changes between the two versions in the typeset output.
This would be my preferred solution, although I haven't tested it out on large, multi-file documents.
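The basic usage is just (file names are placeholders):
latexdiff old.tex new.tex > diff.tex
pdflatex diff.tex
diff.tex is an ordinary LaTeX file with the changes marked up, so you compile it as usual.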
The best package I know of is Easy Review, which provides commenting functionality in the LaTeX environment. For example, you can use simple commands such as \add{NEW TEXT}, \remove{OLD TEXT}, \replace{OLD TEXT}{NEW TEXT}, \comment{TEXT}{COMMENT}, \highlight{TEXT}, and \alert{TEXT}.
Some examples can be found here.
The todonotes package looks great, but if that proves too cumbersome to use, a simple solution is just to use footnotes (e.g. in red to separate them from regular footnotes).
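For example, something like this (a quick sketch; \reviewnote is just a name I made up):
\usepackage{xcolor}
\newcommand{\reviewnote}[1]{\footnote{\textcolor{red}{#1}}}
% \renewcommand{\reviewnote}[1]{}   % uncomment for the print version

Some claim.\reviewnote{Needs a reference.}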
The trackchanges.sty package works in exactly the same way as changes.sty; see Svante's reply above.
It has easy-to-remember commands, and you can change how the edits appear in the compiled document. You can also hide the edits for printing.