Unexpected behaviours with Tcl/Expect and Cisco IOS
I'm trying to log into a Cisco switch and run a list of commands.
Using the following code, I'm able to log into the device, enable, and configure terminal:
# Connect to single host, enable, and configure
proc connect {host payload username password enablepassword} {
    send_user "Connecting to: $host $payload $username $password $enablepassword\n"
    spawn ssh -o "StrictHostKeyChecking no" -l $username $host
    # Pardon the rudeness; some switches are upper case, some are lower case
    expect "assword:"
    send "$password\r"
    # Switch to enable mode
    expect ">"
    send "en\r"
    expect "assword:"
    send "$enablepassword\r"
    expect "*#"
    send -- "conf t\r"
    expect "config*#"
}
However, using the following code, I get the output below ($payload names a file which has one IOS command per line):
proc drop_payload {payload} {
    set f [open "$payload"]
    set payload [split [read $f] "\n"]
    close $f
    foreach pld $payload {
        send -- "$pld\r"
        expect "config*#"
        sleep 2
    }
}
My expectation is that this loop will iterate over each line in the file; however, the Expect debug output (from exp_internal 1) is as follows:
HOST-0001#
expect: does " \r\nHOST-0001#" (spawn_id exp7) match glob pattern "*#"? yes
expect: set expect_out(0,string) " \r\nHOST-0001#"
expect: set expect_out(spawn_id) "exp7"
expect: set expect_out(buffer) " \r\nHOST-0001#"
send: sending "conf t\r" to { exp7 }
expect: does "" (spawn_id exp7) match glob pattern "config*#"? no
c
expect: does "c" (spawn_id exp7) match glob pattern "config*#"? no
o
expect: does "co" (spawn_id exp7) match glob pattern "config*#"? no
n
expect: does "con" (spawn_id exp7) match glob pattern "config*#"? no
f
expect: does "conf" (spawn_id exp7) match glob pattern "config*#"? no
expect: does "conf " (spawn_id exp7) match glob pattern "config*#"? no
t
expect: does "conf t" (spawn_id exp7) match glob pattern "config*#"? no
expect: does "conf t\r\n" (spawn_id exp7) match glob pattern "config*#"? no
Enter configuration commands, one per line. End with CNTL/Z.
HOST-0001(config)#
expect: does "conf t\r\nEnter configuration commands, one per line. End with CNTL/Z.\r\nHOST-0001(config)#" (spawn_id exp7) match glob pattern "config*#"? yes
expect: set expect_out(0,string) "configuration commands, one per line. End with CNTL/Z.\r\nHOST-0001(config)#"
expect: set expect_out(spawn_id) "exp7"
expect: set expect_out(buffer) "conf t\r\nEnter configuration commands, one per line. End with CNTL/Z.\r\nHOST-0001(config)#"
}end: sending "no logging 172.x.x.20\r" to { exp0 no logging 172.x.x.20
expect: does "" (spawn_id exp0) match glob pattern "config*#"? no
expect: timed out
}end: sending "no logging 172.x.x.210\r" to { exp0 no logging 172.x.x.210
expect: does "" (spawn_id exp0) match glob pattern "config*#"? no
expect: timed out
}end: sending "no logging 172.x.x.9\r" to { exp0 no logging 172.x.x.9
expect: does "" (spawn_id exp0) match glob pattern "config*#"? no
expect: timed out
}end: sending "no logging 172.x.x.210\r" to { exp0 no logging 172.x.x.210
expect: does "" (spawn_id exp0) match glob pattern "config*#"? no
expect: timed out
}end: sending "no logging 172.x.x.20\r" to { exp0 no logging 172.x.x.20
expect: does "" (spawn_id exp0) match glob pattern "config*#"? no
expect: timed out
}end: sending "logging 172.x.x.50\r" to { exp0 logging 172.x.x.50
expect: does "" (spawn_id exp0) match glob pattern "config*#"? no
expect: timed out
I'm confused as to why expect is matching against "conf t", which is being sent to the host, not received from it.
I'm also confused as to why none of the commands after conf t ever hit the switch; they all time out instead.
You can try sending the configurations with the spawn_id:
spawn ssh -o "StrictHostKeyChecking no" -l $username $host
# After process creation, a reference to the process is saved in the
# standard Expect variable 'spawn_id'; copy it to the variable 'id'
set id $spawn_id
Now the variable 'id' holds the reference to the ssh process, and we can use send and expect with that spawn id.
# Now we set the spawn id to our ssh process to make sure
# we are sending the commands to the right process.
# You can pass the variable 'id' as an argument to 'drop_payload'.
set spawn_id $id
foreach pld $payload {
    send -- "$pld\r"
    expect "config*#"
    sleep 2
}
Or you can do it the other way around, as follows:
foreach pld $payload {
    # This way is useful when you want to send to and expect from
    # multiple processes simultaneously.
    send -i $id "$pld\r"
    expect -i $id "config*#"
    sleep 2
}
I found that each function/procedure was talking to a different spawn id: spawn had set spawn_id locally inside connect, so when drop_payload ran it used the default spawn_id (exp0, the user's own terminal), which is why the commands never reached the switch.
One method is to follow Dinesh's advice and explicitly define the spawn id.
My workaround was to simply stuff everything into a single output procedure.
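For reference, here is a minimal sketch of the explicit-spawn-id approach applied to the procs above (untested; the signatures are illustrative): connect returns the spawn id it created, and drop_payload targets that id with -i, so the payload goes to the ssh session instead of the default exp0.

proc connect {host username password enablepassword} {
    spawn ssh -o "StrictHostKeyChecking no" -l $username $host
    expect "assword:"
    send "$password\r"
    expect ">"
    send "en\r"
    expect "assword:"
    send "$enablepassword\r"
    expect "*#"
    send -- "conf t\r"
    expect "config*#"
    # spawn set spawn_id locally in this proc; hand it back to the caller
    return $spawn_id
}

proc drop_payload {payload id} {
    set f [open $payload]
    set lines [split [read $f] "\n"]
    close $f
    foreach pld $lines {
        # -i targets the ssh session rather than the current default spawn_id
        send -i $id -- "$pld\r"
        expect -i $id "config*#"
    }
}

set id [connect $host $username $password $enablepassword]
drop_payload $payload $id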
Related
Expect getting stuck when inside a foreach after first iteration
Output is stuck after the first iteration. Works fine when only one expect exists within the loop. Using exp_continue also fails when used.

#!/usr/bin/expect -f
exp_internal 1
set timeout -1
set passwords [list foo bar test]
set connected false
set passwordUsed "test"
spawn ssh -oHostKeyAlgorithms=+ssh-dss root@192.168.1.136 -y
foreach i $passwords {
    expect "assword:" { send -- "$i\r";}
    expect "asd" {send "test"}
}
expect eof

Output:

spawn ssh -oHostKeyAlgorithms=+ssh-dss root@192.168.1.136 -y
root@192.168.1.136's password:
root@192.168.1.136's password:
debug
parent: waiting for sync byte
parent: telling child to go ahead
parent: now unsynchronized from child
spawn: returns {1937}
expect: does "" (spawn_id exp4) match glob pattern "assword:"? no
expect: does "\r" (spawn_id exp4) match glob pattern "assword:"? no
expect: does "\rroot@192.168.1.136's password: " (spawn_id exp4) match glob pattern "assword:"? yes
expect: set expect_out(0,string) "assword:"
expect: set expect_out(spawn_id) "exp4"
expect: set expect_out(buffer) "\rroot@192.168.1.136's password:"
send: sending "foo\r" to { exp4 }
expect: does " " (spawn_id exp4) match glob pattern "asd"? no
expect: does " \r\n" (spawn_id exp4) match glob pattern "asd"? no
expect: does " \r\n\rroot@192.168.1.136's password: " (spawn_id exp4) match glob pattern "asd"? no

Then it hangs on the last expect.
The timeout value is the problem: expect is waiting forever to see "asd". You need something like this (untested):

set passwords {foo bar baz}
set idx 0
spawn ...
expect {
    "assword:" {
        if {$idx == [llength $passwords]} {
            error "none of the passwords succeeded"
        }
        send "[lindex $passwords $idx]\r"
        incr idx
        exp_continue
    }
    "asd" {send "test"}
}
expect eof

exp_continue loops within the expect command to wait for "asd" or for "assword" to appear again. The "asd" case does not use exp_continue, so after sending "test", this expect command ends.
Spirit: Allowing a character at the beginning but not in the middle
I'm trying to write a parser for JavaScript identifiers. So far this is what I have:

// All these rules have string as their attribute.
identifier_ = identifier_start
    >> *( identifier_part >> -(qi::char_(".") > identifier_part) )
    ;
identifier_part = +(qi::alnum | qi::char_("_"));
identifier_start = qi::char_("a-zA-Z$_");

This parser works fine for the list of "good identifiers" in my tests:

"x__", "__xyz", "_", "$", "foo4_.bar_3", "$foo.bar", "$foo", "_foo_bar.foo", "_foo____bar.foo"

but I'm having trouble with one of the bad identifiers: foo$bar. This is supposed to fail, but it succeeds, and the synthesized attribute has the value "foo". Here is the debug output for foo$bar:

<identifier_>
  <try>foo$bar</try>
  <identifier_start>
    <try>foo$bar</try>
    <success>oo$bar</success>
    <attributes>[[f]]</attributes>
  </identifier_start>
  <identifier_part>
    <try>oo$bar</try>
    <success>$bar</success>
    <attributes>[[f, o, o]]</attributes>
  </identifier_part>
  <identifier_part>
    <try>$bar</try>
    <fail/>
  </identifier_part>
  <success>$bar</success>
  <attributes>[[f, o, o]]</attributes>
</identifier_>

What I want is for the parser to fail when parsing foo$bar but not when parsing $foobar. What am I missing?
You don't require the parser to consume all input. When a rule stops matching before the $ sign, it returns with success, because nothing says it can't be followed by a $ sign. So you would like to assert that it isn't followed by a character that could be part of an identifier:

identifier_ = identifier_start
    >> *( identifier_part >> -(qi::char_(".") > identifier_part) )
    >> !identifier_start
    ;

A related directive is distinct from the Qi repository: http://www.boost.org/doc/libs/1_55_0/libs/spirit/repository/doc/html/spirit_repository/qi_components/directives/distinct.html
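For illustration, a minimal way to exercise the fixed rule (an untested sketch; it assumes the rule declarations from the question are in scope):

std::string input = "foo$bar", attr;
auto first = input.begin(), last = input.end();
// With the trailing !identifier_start, the rule no longer succeeds on a
// partial match: this returns false for "foo$bar" (the '$' could start an
// identifier) but true for "$foobar".
bool ok = qi::parse(first, last, identifier_, attr);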
grep wildcards inside file
I am using the following code to filter SPAM on my mail server:

if 822field from | grep -qiFf "/email_filters/from.txt"; then exit 99; fi

and also:

if 822field subject | grep -qiFf "email_filters/bad_subject.txt"; then exit 99; fi

Inside the two above-referenced files, I have lists of SPAM email sources and subjects (e.g., pfizer, cialis, viagra, etc.), each on its own line in the text file. How can I have wildcards within the file? For example, I have recently begun receiving fake U.S. Postal Service notifications from "status_38@unikmetal.ru", so I'd like to block "@*.ru".
When you add -F to grep, it processes fixed strings, not regular expressions. To use wildcards you must use regular expressions, as far as I know. Remove the -F from the grep command:

grep -qif "/email_filters/from.txt"

To block your Russian email addresses you can add something like this to your filters:

@.*\.ru

Explanation:

@  - match '@' literally
.* - match any character 0 or more times
\. - match '.' literally
ru - match 'ru' literally
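A quick, illustrative sanity check of that pattern from a shell (the address is the one from the question):

echo "status_38@unikmetal.ru" | grep -qi '@.*\.ru' && echo blocked
# prints: blocked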
Parsing log lines using awk
I have to parse some information out of big log file lines. It's something like:

abc.log:2012-03-03 11:12:12,457 ABC[123.RPH.-101] XYZ: Query=get_data @a=0,@b=1 Rows=10Time=100

There are many log lines like the above in the log files. I need to extract information like:

datetime, i.e. 2012-03-03 11:12:12,457
job details, i.e. 123.RPH.-101
Query, i.e. get_data (no parameters)
Rows, i.e. 10
Time, i.e. 100

So the output should look like:

2012-03-03 11:12:12,457|123|-101|get_data|10|100

I have tried various permutations and combinations with awk but am not getting it right.
Well, this is really horrible, but since sed is in the tags and there are no answers yet...

sed -e 's/[^0-9]*//' -re 's/[^ ]*\[([^.]*)\.[^.]*\.([^]]*)\]/| \1 | \2/' -e 's/[^ ]* Query=/| /' -e 's/ [^ ]* Rows=/ | /' -e 's/Time=/ | /' my_logfile
My solution in gawk: it uses the gawk extension to match(). You didn't give a specification of the file format, so you may have to adjust the regexes.

Script invocation:

gawk -v OFS='|' -f script.awk

{
    match($0, /[0-9]+-[0-9]+-[0-9]+ [0-9]+:[0-9]+:[0-9]+,[0-9]+/)
    date_time = substr($0, RSTART, RLENGTH)
    match($0, /\[([0-9]+).RPH.(-?[0-9]+)\]/, matches)
    job_detail_1 = matches[1]
    job_detail_2 = matches[2]
    match($0, /Query=(\w+)/, matches)
    query = matches[1]
    match($0, /Rows=([0-9]+)/, matches)
    rows = matches[1]
    match($0, /Time=([0-9]+)/, matches)
    time = matches[1]
    print date_time, job_detail_1, job_detail_2, query, rows, time
}
Here's another, less fancy, AWK solution (but it works in mawk too):

BEGIN { OFS="|" }
{
    i = match($3, /\[[^]]+\]/)
    job = substr($3, i + 1, RLENGTH - 2)
    split($5, X, "=")
    query = X[2]
    split($7, X, "=")
    rows = X[2]
    split($8, X, "=")
    time = X[2]
    print $1 " " $2, job, query, rows, time
}

Note that this assumes the Rows=10 and Time=100 strings are separated by a space, that is, there was a typo in the question example.
TXR:

@(collect :vars ())
@file:@year-@mon-@day @hh:@mm:@ss,@ms @jobname[@job1.RPH.@job2] @queryname: Query=@query @params Rows=@{rows /[0-9]+/}Time=@time
@(output)
@year-@mon-@day @hh-@mm-@ss,@ms|@job1|@job2|@query|@rows|@time
@(end)
@(end)

Run:

$ txr data.txr data.log
2012-03-03 11-12-12,457|123|-101|get_data|10|100

Here is one way to make the program assert that every line in the log file must match the pattern. First, do not allow gaps in the collection. This means that nonmatching material cannot be skipped just to look for the lines which match:

@(collect :gap 0 :vars ())

Secondly, at the end of the script we add this:

@(eof)

This specifies a match on the end of file. If the @(collect) bails early because of a nonmatching line (due to the :gap 0 constraint), the @(eof) will fail and so the script will terminate with a failed status.

In this type of task, field-splitting regex hacks will backfire because they can blindly produce incorrect results for some subset of the input being processed. If the input contains a vast number of lines, there is no easy way to check for mistakes. It's best to have a very specific match that is likely to reject anything which doesn't resemble the examples on which the pattern is based.
Just need the right field separators:

awk -F '[][ =.]' -v OFS='|' '{print $1 " " $2, $4, $6, $10, $15, $17}'

I'm assuming the "abc.log:" is not actually in the log file.
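To illustrate, here is the one-liner run against the sample line (with the assumed space before Time=, and the abc.log: prefix removed):

echo '2012-03-03 11:12:12,457 ABC[123.RPH.-101] XYZ: Query=get_data @a=0,@b=1 Rows=10 Time=100' |
awk -F '[][ =.]' -v OFS='|' '{print $1 " " $2, $4, $6, $10, $15, $17}'
# 2012-03-03 11:12:12,457|123|-101|get_data|10|100

Note that $7 lands on the empty string between "]" and the following space, which is why the interesting fields are $10, $15, and $17.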
How to genericize this parse rule to include any verbs for this UML Dialect?
I want to make a PARSE rule (use-rule) that covers several verbs: Connect, Use, List, Show, etc.

use-rule: [
    some [
        copy Actor to 'Connect thru 'Connect 'to copy UseCase to end
        (append output rejoin ["[" Actor "]-(" "Connect to " UseCase ")"])
    ]
    |
    [
        copy Actor to 'Use thru 'Use copy UseCase to end
        (append output rejoin ["[" Actor "]-(" "Use " UseCase ")"])
    ]
    |
    [
        copy Actor to 'List thru 'List copy UseCase to end
        (append output rejoin ["[" Actor "]-(" "List " UseCase ")"])
    ]
    |
    ;; ...
    ;; same for Show, Search, Select, Checkout, Pay, Delete, Modify, Add, Manage
    ;; ...
]

How can I make it generic, so it accepts any verb? Something like:

[
    copy Actor to 'Any-Verb thru 'Any-Verb copy UseCase to end
    (append output rejoin ["[" Actor "]-(" "Any-Verb " UseCase ")"])
]

That way I would not need to add a new section to the rule each time I need a new verb. (Note: this rule is part of a global parse rule used here: http://askuml.com/blog/e-commerce/)
Rather than do that, I'd prefer to write a function that takes all the verbs as input and generates the parse rule for you. So, if there's a new verb, you just add it to the list of verbs rather than modify the rule. And that would avoid errors too ... is your 2nd-to-last parse rule an error?
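Something along these lines could work (an untested REBOL sketch; it ignores the special two-word "Connect to" case and the some wrapper, and the name make-use-rule is my own):

make-use-rule: func [verbs [block!] /local rules] [
    rules: copy []
    foreach verb verbs [
        ; build one alternative block per verb, mirroring the hand-written rule
        append/only rules compose/deep [
            copy Actor to (to lit-word! verb) thru (to lit-word! verb)
            copy UseCase to end
            (to paren! compose [
                append output rejoin ["[" Actor "]-(" (join form verb " ") UseCase ")"]
            ])
        ]
        append rules '|
    ]
    remove back tail rules  ; drop the trailing |
    rules
]

use-rule: make-use-rule [Use List Show Search Select Checkout Pay Delete Modify Add Manage]

With this, adding a verb means adding one word to the block passed to make-use-rule instead of copying a whole rule section.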