The <<: operator in YAML can be used to merge the contents of one mapping into another, similar to Python's ** double-splat operator or JavaScript's ... spread operator in object literals. For example,
foo:
  a: b
  <<:
    c: d
  e: f
is equivalent to
foo:
  a: b
  c: d
  e: f
This is useful when used along with node anchors to include some common default properties in many objects, as illustrated in, for example, the Learn YAML in Y minutes tutorial:
# Anchors can be used to duplicate/inherit properties
base: &base
  name: Everyone has same name

foo: &foo
  <<: *base
  age: 10

bar: &bar
  <<: *base
  age: 20
However, I am confused about where this syntax comes from and why it works. Searching the YAML specification for << reveals that it doesn't appear anywhere. Yet it's supported by, at the very least, PyYAML and Online YAML Parser.
What is this syntax, and how come it doesn't seem to appear in the specification?
It is called the Merge Key Language-Independent Type for YAML version 1.1, and it is specified here.
It is something that parsers can optionally support. Essentially, it is a special interpretation of a key-value pair whose key is <<, where the value is either a single mapping (usually indicated via an alias, as in the spec; an alias doesn't seem to be required, but it makes little sense not to use one) or a list of mappings (i.e., aliases of mappings), and that pair gets interpreted in a special way.
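For example, PyYAML (a YAML 1.1 loader that supports merge keys) will flatten both forms. Here is a minimal sketch of the list-of-mappings case; the key names (defaults, pooling, etc.) are made up for illustration:

```python
import yaml

doc = """
defaults: &defaults
  adapter: postgres
pooling: &pooling
  pool: 5
development:
  <<: [*defaults, *pooling]   # merge a list of aliased mappings
  database: dev_db
"""

dev = yaml.safe_load(doc)["development"]
# entries from both aliased mappings were merged into development
assert dev == {"adapter": "postgres", "pool": 5, "database": "dev_db"}
```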
To add on to other answers:
IMO, the example from "Learn YAML in Y Minutes" is incomplete, because it doesn't show what happens when the keys are the same. For example:
base: &base
  name: Everyone has same name
  age: 5

foo: &foo
  <<: *base

bar: &bar
  <<: *base
  age: 20
For the bottom two items, it yields:
foo:
  name: Everyone has same name
  age: 5
bar:
  name: Everyone has same name
  age: 20
bar overrides the age while foo does not. According to the spec, the entries of the mapping being merged in have lower priority than those already present on the receiving mapping:
The “<<” merge key is used to indicate that all the keys of one or more specified maps should be inserted into the current map. If the value associated with the key is a single mapping node, each of its key/value pairs is inserted into the current mapping, unless the key already exists in it.
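This priority rule can be checked directly with PyYAML, which implements the YAML 1.1 merge key — a quick sketch:

```python
import yaml

doc = """
base: &base
  name: Everyone has same name
  age: 5
bar:
  <<: *base
  age: 20
"""

bar = yaml.safe_load(doc)["bar"]
# bar's explicit age wins; name is filled in from the merged-in base
assert bar == {"name": "Everyone has same name", "age": 20}
```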
I'm trying to replace hardcoded values in a Helm 3 template with variables.
The variable's name must match the value of the name string from the Values.yaml file (name: test-rrrr-blabla-file), without spaces and using only the last two blocks, i.e. blablafile.
I didn't find adequate examples of how to do this in
https://helm.sh/docs/chart_best_practices/templates/
so I tried the following expression:
{{- default .Chart.Name .Values | (split "-" .Values.nameOverride)._3 }}-{{ (split "-" .Values.nameOverride)._4 }}
but it didn't work.
I also found some undocumented capabilities here:
https://github.com/Masterminds/sprig/blob/master/docs/strings.md#regexfind
I'm not sure exactly — maybe I need to use regexSplit
or regex_replace, but I don't understand how to compose the expression properly. Maybe you have come across this in practice?
Any help would be appreciated.
Thank you!
In C#, I can do the following:
int @private = 15;
And in VB.NET, I can do the following:
Dim [Private] As Integer = 15
I am wondering if there is a way in F# to use reserved keywords as identifiers, like there is in VB.NET and C#?
Given section 3.4 of the F# 2.0 spec:
Identifiers follow the specification below. Any sequence of characters that is enclosed in double-backtick marks (`` ``), excluding newlines, tabs, and double-backtick pairs themselves, is treated as an identifier.
I suspect you can put it in backticks:
``private``
I haven't tried it though.
I'm using Sphinx and Thinking Sphinx in my Rails project. Is there any way I can make Sphinx stop ignoring punctuation, so that searching for "Foo " would find "Foo Bar" but wouldn't find, for example, "foo#bar.com"?
Sphinx actually treats # as a non-indexed character, the same as ., so both become word separators. Thus, "foo#bar.com" is actually indexed as "foo bar com".
If you want "foo#bar.com" to be kept exactly as it is, then you should add # and . to your charset_table value in config/thinking_sphinx.yml for each environment. My example here is the default set, plus the Unicode values for those two characters at the end (U+0023 is #, U+002E is .).
development:
  charset_table: 0..9, A..Z->a..z, _, a..z, U+410..U+42F->U+430..U+44F, U+430..U+44F, U+401->U+451, U+451, U+0023, U+002E
An alternative is to add those characters to the ignore_chars option instead. This way, they're deleted in the indexed data, so "foo#bar.com" becomes "foobarcom":
development:
  ignore_chars: U+0023, U+002E
Keep in mind that both of these settings will impact those characters everywhere in your indexed data, not just in email addresses (and no, you can't set them on a per-field basis).
I've added the blend_chars option to my index and everything started working as needed:
set_property :blend_chars => 'U+20, #, .'
U+20 is actually a space.
I am trying to use the "food_descriptions" fixture in a "minitest" test in Rails 4 beta1:
butter:
  NDB_No: "01001"
  FdGrp_Cd: "0100"
  Long_Desc: "Butter, salted"
The test I have is this:
it "must work" do
  food_descriptions(:butter).NDB_No.must_equal "01001"
end
However, when I run the test I get this error: Expected: "01001" Actual: 1001
I don't understand why that number is not recognized as a string. I've read that YAML treats values that start with 0 as octal, so adding the quotes should be enough to treat it as a string, but it is not working. I have also tried the pipe "|" sign, but that doesn't work either.
Any idea why?
Quick Answer (TL;DR)
YAML 1.2 leading zeros can be preserved by quoting a scalar value.
Context
YAML 1.2
Scalar value with leading zeros
Problem
Scenario: Developer wishes to specify a scalar value with leading zero in a YAML 1.2 file.
When parsed, the leading zero gets omitted or truncated
Solution
Quote a scalar value in YAML to have it parsed as a string.
Leading zeros are preserved for non-numeric values.
Pitfalls
Data type casting for databases or programming language context may convert string scalar to numeric scalar value.
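The difference is easy to see with PyYAML, a YAML 1.1 loader (as Rails used at the time) — a small sketch:

```python
import yaml

# Unquoted, a leading-zero scalar matches YAML 1.1's octal integer syntax
unquoted = yaml.safe_load("code: 01001")["code"]
assert unquoted == 0o1001  # the integer 513; the leading zero is lost

# Quoted, the scalar stays a string and the leading zero survives
quoted = yaml.safe_load('code: "01001"')["code"]
assert quoted == "01001"
```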
It turns out the problem is not what I thought it was (YAML). The fixtures were being pushed to the DB, and the tests were actually retrieving the entries from the database (I thought the fixtures were just in memory). The database column type for that value was integer, not string, so the leading zeros were being removed. My real problem was that I wanted that column to be the table's primary key, of type string, and I didn't realize that the migration I created didn't change the column's type to string in the test database.
We have a system designed for processing social media content. In our Storm topology we have bolts for tasks such as sentiment analysis, language detection, spam detection, and so on. In all the Storm tutorials and examples we have seen, a bolt can only emit the tuple fields it has declared in its declareOutputFields() method. Is there any option to emit the current bolt's fields together with the input tuple's fields?
For example, I have an input tuple which contains the fields below:
<
  text: bla bla
  username: paul
  date: 01.01.2013
  source: twitter
>
I want to define the output tuple as:
<
  text: bla bla
  username: paul
  date: 01.01.2013
  source: twitter
  lang: tr
>
Note that I don't want my bolts to need to know anything about the previous bolt's output tuple schema.
Thank you.
You could achieve something like this by writing a function that returns a bolt, rather than defining the bolt directly: parameterize the bolt's creation so the function returns a bolt object with the output fields you want.
Obviously this has to be done at the time you deploy your topology, so it can't be dynamic on the stream at run time, but it can be dynamic at startup time. Something like:
(defn make-bolt [bolt-name input-fields]
  (defbolt bolt-name input-fields
    ...))

....

(topology
  {} ;; spouts
  {"a-bolt" (bolt-spec {"a-spout" :shuffle}
                       (make-bolt bolt-name ["input" "tuple" "lang"]))})