I'm trying to add a place within app scope via the Google Places API, using Go. But I keep getting an empty result, with no error message.
Here is my code:
```
type latlng struct {
lat, lng float64
}
type newPlace struct {
location latlng
accuracy int
name string
phone_number string
address string
types string
}
func main() {
requestUrl := "https://maps.googleapis.com/maps/api/place/add/json?key=<MYAPIKEY>"
obj := newPlace{
location: latlng{
lat: 52.1502824,
lng: 38.2643063,
},
name: "some field",
types: "storage",
}
bodyBytes, err := json.Marshal(&obj)
if err != nil {
panic(err)
}
body := bytes.NewReader(bodyBytes)
rsp, err := http.NewRequest("POST", requestUrl, body)
if err != nil {
log.Fatal(err)
}
defer rsp.Body.Close()
body_byte, err := ioutil.ReadAll(rsp.Body)
if err != nil {
panic(err)
}
fmt.Println(string(body_byte))
}
```
Here is the documentation that I followed: https://developers.google.com/places/web-service/add-place
I'm a bit new to Go; any help would be much appreciated.
FYI, I wrote an article on this tricky topic (encoding JSON data into a POST request body in Go).
You're missing four things here:
the http.Client creation: you then need to execute the request you prepare with http.NewRequest by calling client.Do
add JSON tags to your struct and export the fields it contains by capitalizing their first letters
set the Content-Type header to application/json
Google expects an array rather than a string in types, so I replaced it with an array containing one string (but you should adapt this depending on how many types you want to pass to Google)
Here is a working script:
type latlng struct {
Lat float64 `json:"lat"`
Lng float64 `json:"lng"`
}
type newPlace struct {
Location latlng `json:"location"`
Accuracy int `json:"accuracy"`
Name string `json:"name"`
PhoneNumber string `json:"phone_number"`
Address string `json:"address"`
Types [1]string `json:"types"`
}
func main() {
requestUrl := "https://maps.googleapis.com/maps/api/place/add/json?key=<your key>"
types := [1]string{"storage"}
obj := newPlace{
Location: latlng{
Lat: 52.1502824,
Lng: 38.2643063,
},
Name: "some field",
Types: types,
}
bodyBytes, err := json.Marshal(&obj)
if err != nil {
fmt.Println(err)
return
}
body := bytes.NewReader(bodyBytes)
client := &http.Client{}
req, err := http.NewRequest("POST", requestUrl, body)
if err != nil {
fmt.Println(err)
return
}
req.Header.Add("Content-Type", "application/json")
// execute the prepared request with the client
rsp, err := client.Do(req)
if err != nil {
fmt.Println(err)
return
}
defer rsp.Body.Close()
body_byte, err := ioutil.ReadAll(rsp.Body)
if err != nil {
fmt.Println(err)
return
}
fmt.Println(string(body_byte))
}
Hope it's working now!
You're trying to marshal objects to JSON which have no exported fields, so the resulting JSON document is empty. Per the encoding/json documentation, it will only marshal exported fields (those whose names begin with a capital letter). Try:
type latlng struct {
Lat float64 `json:"lat"`
Lng float64 `json:"lng"`
}
type newPlace struct {
Location latlng `json:"location"`
Accuracy int `json:"accuracy"`
Name string `json:"name"`
PhoneNumber string `json:"phone_number"`
Address string `json:"address"`
Types string `json:"types"`
}
My protobuf message definition is this:
package main;
message Test {
optional string id = 1;
optional string name = 2;
optional string age = 3;
}
Then I populate the protobuf message from the input in Go using the following code; str is already parsed.
test = &Test{
id: proto.String(str[0]),
name: proto.String(str[1]),
age: proto.String(str[2]),
},
One condition I have to handle is that one or more optional fields in the Test structure could be missing, and I don't know which ones in advance. How do I handle that in Go?
To give more context, the real data in the file can look like this:
id=1, name=peter, age=24
id=2, age=30
name=mary, age=31
id=100
name=bob
age=11
You could use a regular expression to change your input strings into valid JSON, then use the encoding/json package to parse it. This has the advantage of letting the json parser take care of everything for you. Here is the regex for your particular case.
Basically, the regex looks for field=value and replaces it with "field" : "value", and the result is wrapped in {} to create valid JSON. The commas are left as is.
https://play.golang.org/p/_EEdTB6sve
package main
import (
"encoding/json"
"errors"
"fmt"
"log"
"regexp"
)
var ins = []string{
`id=1, name=peter, age=24`,
`id=2, age=30`,
`name=mary, age=31`,
`id=100`,
`name=bob`,
`age=11`,
}
var ParseError = errors.New("bad parser input")
var Regex *regexp.Regexp
type Test struct {
ID string
Name string
Age string
}
func (t *Test) String() string {
return fmt.Sprintf("ID: %s, Name: %s, Age: %s", t.ID, t.Name, t.Age)
}
func main() {
var err error
Regex, err = regexp.Compile(`([^,\s]*)=([^,\s]*)`)
if err != nil {
log.Panic(err)
}
for _, v := range ins {
test, err := ParseLine(v)
if err != nil {
log.Panic(err)
}
fmt.Println(test)
}
}
func ParseLine(inp string) (*Test, error) {
JSON := fmt.Sprintf("{%s}", Regex.ReplaceAllString(inp, `"$1" : "$2"`))
test := &Test{}
err := json.Unmarshal([]byte(JSON), test)
if err != nil {
return nil, err
}
return test, nil
}
Here is what I believe to be a minimal working example of what you are after, though I am not familiar enough with protocol buffers to get the strings to print right... or even to verify that they are correct. Note that this doesn't run in the playground.
package main
import (
"errors"
"fmt"
"log"
"regexp"
"github.com/golang/protobuf/jsonpb"
_ "github.com/golang/protobuf/proto"
)
var ins = []string{
`id=1, name=peter, age=24`,
`id=2, age=30`,
`name=mary, age=31`,
`id=100`,
`name=bob`,
`age=11`,
}
var ParseError = errors.New("bad parser input")
var Regex *regexp.Regexp
type Test struct {
ID *string `protobuf:"bytes,1,opt,name=id,json=id" json:"id,omitempty"`
Name *string `protobuf:"bytes,2,opt,name=name,json=name" json:"name,omitempty"`
Age *string `protobuf:"bytes,3,opt,name=age,json=age" json:"age,omitempty"`
}
func (t *Test) Reset() {
*t = Test{}
}
func (*Test) ProtoMessage() {}
func (*Test) Descriptor() ([]byte, []int) {return []byte{}, []int{0}}
func (t *Test) String() string {
return fmt.Sprintf("ID: %v, Name: %v, Age: %v", t.ID, t.Name, t.Age)
}
func main() {
var err error
Regex, err = regexp.Compile(`([^,\s]*)=([^,\s]*)`)
if err != nil {
log.Panic(err)
}
for _, v := range ins {
test, err := ParseLine(v)
if err != nil {
fmt.Println(err)
log.Panic(err)
}
fmt.Println(test)
}
}
func ParseLine(inp string) (*Test, error) {
JSON := fmt.Sprintf("{%s}", Regex.ReplaceAllString(inp, `"$1" : "$2"`))
test := &Test{}
err := jsonpb.UnmarshalString(JSON, test)
if err != nil {
return nil, err
}
return test, nil
}
It looks like you can write the parser for each line of your input something like the following.
NOTE: I didn't build the struct with proto values because, as an external package, it can't be imported in the playground.
https://play.golang.org/p/hLZvbiMMlZ
package main
import (
"errors"
"fmt"
"strings"
)
var ins = []string{
`id=1, name=peter, age=24`,
`id=2, age=30`,
`name=mary, age=31`,
`id=100`,
`name=bob`,
`age=11`,
}
var ParseError = errors.New("bad parser input")
type Test struct {
ID string
Name string
Age string
}
func (t *Test) String() string {
return fmt.Sprintf("ID: %s, Name: %s, Age: %s", t.ID, t.Name, t.Age)
}
func main() {
for _, v := range ins {
t, err := ParseLine(v)
if err != nil {
fmt.Println(err)
} else {
fmt.Println(t)
}
}
}
func ParseLine(inp string) (*Test, error) {
splt := strings.Split(inp, ",")
test := &Test{}
for _, f := range splt {
fieldVal := strings.Split(strings.TrimSpace(f), "=")
if len(fieldVal) != 2 {
// malformed field, e.g. missing "=": treat it as a parse error
return nil, ParseError
}
switch strings.TrimSpace(fieldVal[0]) {
case "id":
test.ID = strings.TrimSpace(fieldVal[1])
case "name":
test.Name = strings.TrimSpace(fieldVal[1])
case "age":
test.Age = strings.TrimSpace(fieldVal[1])
default:
return nil, ParseError
}
}
return test, nil
}
I'm trying to unmarshal some XML which is structured like the following example:
<player>
<stat type="first_name">Somebody</stat>
<stat type="last_name">Something</stat>
<stat type="birthday">06-12-1987</stat>
</player>
It's dead easy to unmarshal this into a struct like
type Player struct {
Stats []Stat `xml:"stat"`
}
but I'm looking to find a way to unmarshal it into a struct that's more like
type Player struct {
FirstName string `xml:"stat[#Type='first_name']"`
LastName string `xml:"stat[#Type='last_name']"`
Birthday Time `xml:"stat[#Type='birthday']"`
}
Is there any way to do this with the standard encoding/xml package?
If not, can you give me a hint as to how one would break down such a "problem" in Go? (Best practices on Go software architecture for such a task, basically.)
Thank you!
The encoding/xml package doesn't implement XPath, but it does have a simple set of selection rules it can use.
Here's an example of how you could unmarshal the XML you have using encoding/xml. Because the stats are all of the same type, with the same attributes, the easiest way to decode them will be into a slice of the same type. http://play.golang.org/p/My10GFiWDa
var doc = []byte(`<player>
<stat type="first_name">Somebody</stat>
<stat type="last_name">Something</stat>
<stat type="birthday">06-12-1987</stat>
</player>`)
type Player struct {
XMLName xml.Name `xml:"player"`
Stats []PlayerStat `xml:"stat"`
}
type PlayerStat struct {
Type string `xml:"type,attr"`
Value string `xml:",chardata"`
}
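To complete the example, here is a minimal usage sketch (assuming the doc, Player, and PlayerStat declarations above, plus the encoding/xml and fmt imports) that decodes the document and walks the stats slice:
```
func main() {
	var p Player
	if err := xml.Unmarshal(doc, &p); err != nil {
		fmt.Println(err)
		return
	}
	// each stat keeps its type attribute and its text content
	for _, s := range p.Stats {
		fmt.Printf("%s = %s\n", s.Type, s.Value)
	}
}
```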
And if it's something you need to transform often, you could do the transformation by using an UnmarshalXML method: http://play.golang.org/p/htoOSa81Cn
type Player struct {
XMLName xml.Name `xml:"player"`
FirstName string
LastName string
Birthday string
}
func (p *Player) UnmarshalXML(d *xml.Decoder, start xml.StartElement) error {
for {
t, err := d.Token()
if err == io.EOF {
break
} else if err != nil {
return err
}
if se, ok := t.(xml.StartElement); ok {
t, err = d.Token()
if err != nil {
return err
}
var val string
if c, ok := t.(xml.CharData); ok {
val = string(c)
} else {
// not char data, skip for now
continue
}
// assuming we have exactly one Attr
switch se.Attr[0].Value {
case "first_name":
p.FirstName = val
case "last_name":
p.LastName = val
case "birthday":
p.Birthday = val
}
}
}
return nil
}
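And a short usage sketch for this variant, assuming the same doc variable and the encoding/xml, fmt, io and log imports:
```
func main() {
	var p Player
	if err := xml.Unmarshal(doc, &p); err != nil {
		log.Fatal(err)
	}
	// the custom UnmarshalXML has mapped each <stat> onto its named field
	fmt.Printf("%s %s, born %s\n", p.FirstName, p.LastName, p.Birthday)
}
```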
I'm relatively new to Go, and I'm struggling to get my head around using POST data with structs. What I essentially want to do is submit a form and then save that form data to MongoDB (I haven't got that far yet). I can't work out how to use this form data with a struct.
package main
import "net/http"
type Paste struct {
Title string
Content string
Language string
Expires int
Created int64
}
func index(w http.ResponseWriter, r *http.Request) {
if r.Method == "POST" {
r.ParseForm()
// r.Form = map[title:[Wadup] content:[Brother]]
}
}
func main() {
http.HandleFunc("/", index)
http.ListenAndServe(":1234", nil)
}
What I basically want to do is insert the map values into the struct without manually assigning each of them, the way you would with something like p := Paste{Title: r.Form["title"]}.
Use gorilla/schema, which was built for this use-case:
package main
import(
"net/http"
"github.com/gorilla/schema"
)
type Paste struct {
Title string
Content string
Language string
Expires int
Created int64
}
var decoder = schema.NewDecoder()
func index(w http.ResponseWriter, r *http.Request) {
err := r.ParseForm()
// handle error
var paste = &Paste{}
err = decoder.Decode(paste, r.PostForm)
// handle error
}
After you've called r.ParseForm() you can access r.Form, which holds a map[string][]string of the parsed form data. This map can be indexed by key (your form field names):
r.ParseForm()
form := r.Form
someField := form["someField"]
anotherField := form["anotherField"]
or loop through all available keys/values:
r.ParseForm()
form := r.Form
for key, value := range form {
fmt.Println("Key:", key, "Value:", value)
}
Update
In case you want a more generic solution, take a look at the reflect package. Just to give you a rough example, it could look like this:
v := url.Values{}
v.Set("Title", "Your Title")
v.Set("Content", "Your Content")
v.Set("Language", "English")
v.Set("Expires", "2015")
v.Set("Created", "2014")
paste := Paste{}
ps := reflect.ValueOf(&paste)
s := ps.Elem()
for key, value := range v {
f := s.FieldByName(key)
if f.IsValid() && f.CanSet() {
switch f.Kind() {
case reflect.String:
f.SetString(value[0])
case reflect.Int, reflect.Int64:
i, _ := strconv.ParseInt(value[0], 0, 64)
f.SetInt(i)
}
}
}
Play
It seems that url.URL does not support matrix parameters:
// From net/url
type URL struct {
Scheme string
Opaque string // encoded opaque data
User *Userinfo // username and password information
Host string // host or host:port
Path string
RawQuery string // encoded query values, without '?'
Fragment string // fragment for references, without '#'
}
Why?
How can I extract matrix parameters from a URL? And when should I use them instead of request parameters embedded in the request.URL.RawQuery part of the URL?
The parameters end up getting put in url.Path. Here's a function which can put them in the Query for you:
func ParseWithMatrix(u string) (*url.URL, error) {
parsed, err := url.Parse(u)
if err != nil {
return nil, err
}
if strings.Contains(parsed.Path, ";") {
q := parsed.Path[strings.Index(parsed.Path, ";")+1:]
m, err := url.ParseQuery(q)
if err != nil {
return nil, err
}
for k, vs := range parsed.Query() {
for _, v := range vs {
m.Add(k, v)
}
}
parsed.Path = parsed.Path[:strings.Index(parsed.Path, ";")]
parsed.RawQuery = m.Encode()
}
return parsed, nil
}
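A quick usage sketch with a made-up URL (one matrix parameter on the path plus a regular query parameter). Note that url.ParseQuery treats ';' as a separator only in Go releases before 1.17, so on newer versions a path segment carrying several matrix parameters would need to be split on ';' manually before parsing:
```
func main() {
	u, err := ParseWithMatrix("http://example.com/items;color=red?page=2")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(u.Path)    // /items
	fmt.Println(u.Query()) // map[color:[red] page:[2]]
}
```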
I am trying to read the associated doc comments on a struct type using Go's parser and ast packages. In this example, the code simply uses itself as the source.
package main
import (
"fmt"
"go/ast"
"go/parser"
"go/token"
)
// FirstType docs
type FirstType struct {
// FirstMember docs
FirstMember string
}
// SecondType docs
type SecondType struct {
// SecondMember docs
SecondMember string
}
// Main docs
func main() {
fset := token.NewFileSet() // positions are relative to fset
d, err := parser.ParseDir(fset, "./", nil, parser.ParseComments)
if err != nil {
fmt.Println(err)
return
}
for _, f := range d {
ast.Inspect(f, func(n ast.Node) bool {
switch x := n.(type) {
case *ast.FuncDecl:
fmt.Printf("%s:\tFuncDecl %s\t%s\n", fset.Position(n.Pos()), x.Name, x.Doc)
case *ast.TypeSpec:
fmt.Printf("%s:\tTypeSpec %s\t%s\n", fset.Position(n.Pos()), x.Name, x.Doc)
case *ast.Field:
fmt.Printf("%s:\tField %s\t%s\n", fset.Position(n.Pos()), x.Names, x.Doc)
}
return true
})
}
}
The doc comments for the func and the fields are output without a problem, but for some reason the ‘FirstType docs’ and ‘SecondType docs’ are nowhere to be found. What am I missing? The Go version is 1.1.2.
(To run the above, save it into a main.go file, and go run main.go)
Great question!
Looking at the source code of go/doc, we can see that it has to deal with this same case in its readType function. There, it says:
324 func (r *reader) readType(decl *ast.GenDecl, spec *ast.TypeSpec) {
...
334 // compute documentation
335 doc := spec.Doc
336 spec.Doc = nil // doc consumed - remove from AST
337 if doc == nil {
338 // no doc associated with the spec, use the declaration doc, if any
339 doc = decl.Doc
340 }
...
Notice in particular how it needs to deal with the case where the AST does not have a doc attached to the TypeSpec. To do this, it falls back on the GenDecl. This gives us a clue as to how we might use the AST directly to parse doc comments for structs. Adapting the for loop in the question code to add a case for *ast.GenDecl:
for _, f := range d {
ast.Inspect(f, func(n ast.Node) bool {
switch x := n.(type) {
case *ast.FuncDecl:
fmt.Printf("%s:\tFuncDecl %s\t%s\n", fset.Position(n.Pos()), x.Name, x.Doc.Text())
case *ast.TypeSpec:
fmt.Printf("%s:\tTypeSpec %s\t%s\n", fset.Position(n.Pos()), x.Name, x.Doc.Text())
case *ast.Field:
fmt.Printf("%s:\tField %s\t%s\n", fset.Position(n.Pos()), x.Names, x.Doc.Text())
case *ast.GenDecl:
fmt.Printf("%s:\tGenDecl %s\n", fset.Position(n.Pos()), x.Doc.Text())
}
return true
})
}
Running this gives us:
main.go:3:1: GenDecl %!s(*ast.CommentGroup=<nil>)
main.go:11:1: GenDecl &{[%!s(*ast.Comment=&{69 // FirstType docs})]}
main.go:11:6: TypeSpec FirstType %!s(*ast.CommentGroup=<nil>)
main.go:13:2: Field [FirstMember] &{[%!s(*ast.Comment=&{112 // FirstMember docs})]}
main.go:17:1: GenDecl &{[%!s(*ast.Comment=&{155 // SecondType docs})]}
main.go:17:6: TypeSpec SecondType %!s(*ast.CommentGroup=<nil>)
main.go:19:2: Field [SecondMember] &{[%!s(*ast.Comment=&{200 // SecondMember docs})]}
main.go:23:1: FuncDecl main &{[%!s(*ast.Comment=&{245 // Main docs})]}
main.go:33:23: Field [n] %!s(*ast.CommentGroup=<nil>)
main.go:33:35: Field [] %!s(*ast.CommentGroup=<nil>)
And, hey!
We've printed out the long-lost FirstType docs and SecondType docs! But this is unsatisfactory. Why is the doc not attached to the TypeSpec? The go/doc/reader.go file goes to extraordinary lengths to circumvent this issue, actually generating a fake GenDecl and passing it to the readType function mentioned earlier, if there is no documentation associated with the struct declaration!
503 fake := &ast.GenDecl{
504 Doc: d.Doc,
505 // don't use the existing TokPos because it
506 // will lead to the wrong selection range for
507 // the fake declaration if there are more
508 // than one type in the group (this affects
509 // src/cmd/godoc/godoc.go's posLink_urlFunc)
510 TokPos: s.Pos(),
511 Tok: token.TYPE,
512 Specs: []ast.Spec{s},
513 }
But why all this?
Imagine we change the type definitions from the code in the question slightly (defining structs like this is not common, but it is still valid Go):
// This documents FirstType and SecondType together
type (
// FirstType docs
FirstType struct {
// FirstMember docs
FirstMember string
}
// SecondType docs
SecondType struct {
// SecondMember docs
SecondMember string
}
)
Run the code (including the case for ast.GenDecl) and we get:
main.go:3:1: GenDecl %!s(*ast.CommentGroup=<nil>)
main.go:11:1: GenDecl &{[%!s(*ast.Comment=&{69 // This documents FirstType and SecondType together})]}
main.go:13:2: TypeSpec FirstType &{[%!s(*ast.Comment=&{129 // FirstType docs})]}
main.go:15:3: Field [FirstMember] &{[%!s(*ast.Comment=&{169 // FirstMember docs})]}
main.go:19:2: TypeSpec SecondType &{[%!s(*ast.Comment=&{215 // SecondType docs})]}
main.go:21:3: Field [SecondMember] &{[%!s(*ast.Comment=&{257 // SecondMember docs})]}
main.go:26:1: FuncDecl main &{[%!s(*ast.Comment=&{306 // Main docs})]}
main.go:36:23: Field [n] %!s(*ast.CommentGroup=<nil>)
main.go:36:35: Field [] %!s(*ast.CommentGroup=<nil>)
That's right
Now the struct type definitions have their docs, and the GenDecl has its own documentation, too. In the first case, posted in the question, the doc was attached to the GenDecl, since the AST sees the individual struct type definitions as "contractions" of the parenthesized version of type definitions, and wants to handle all definitions the same way, whether they are grouped or not. The same thing would happen with variable definitions, as in:
// some general docs
var (
// v docs
v int
// v2 docs
v2 string
)
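A minimal, self-contained sketch (with a hypothetical source string) showing where those var docs end up, by adding a case for *ast.ValueSpec, the spec type used for var and const declarations:
```
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
)

const src = `package p

// some general docs
var (
	// v docs
	v int
	// v2 docs
	v2 string
)
`

func main() {
	fset := token.NewFileSet()
	f, err := parser.ParseFile(fset, "p.go", src, parser.ParseComments)
	if err != nil {
		fmt.Println(err)
		return
	}
	ast.Inspect(f, func(n ast.Node) bool {
		switch x := n.(type) {
		case *ast.GenDecl:
			// gets "// some general docs"
			fmt.Println("GenDecl:", x.Doc.Text())
		case *ast.ValueSpec:
			// gets "// v docs" and "// v2 docs" respectively
			fmt.Println("ValueSpec:", x.Names, x.Doc.Text())
		}
		return true
	})
}
```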
So if you wish to parse comments with the pure AST, you need to be aware that this is how it works. But the preferred method, as @mjibson suggested, is to use go/doc. Good luck!
You need to use the go/doc package to extract documentation from the AST:
package main
import (
"fmt"
"go/doc"
"go/parser"
"go/token"
)
// FirstType docs
type FirstType struct {
// FirstMember docs
FirstMember string
}
// SecondType docs
type SecondType struct {
// SecondMember docs
SecondMember string
}
// Main docs
func main() {
fset := token.NewFileSet() // positions are relative to fset
d, err := parser.ParseDir(fset, "./", nil, parser.ParseComments)
if err != nil {
fmt.Println(err)
return
}
for k, f := range d {
fmt.Println("package", k)
p := doc.New(f, "./", 0)
for _, t := range p.Types {
fmt.Println(" type", t.Name)
fmt.Println(" docs:", t.Doc)
}
}
}
Here is how to parse all structs annotated with the comment // typescript:interface:
func TestStructDoc(t *testing.T) {
err := filepath.Walk(".",
func(path string, info os.FileInfo, err error) error {
if err != nil {
return err
}
if info.IsDir() {
fmt.Println(path, info.Size())
fset := token.NewFileSet()
d, err := parser.ParseDir(fset, path, nil, parser.ParseComments)
if err != nil {
t.Fatal(err)
}
for k, f := range d {
fmt.Println("package", k)
p := doc.New(f, "./", 0)
for _, t := range p.Types {
fmt.Println(" type", t.Name)
fmt.Println(" docs:", t.Doc)
if strings.HasPrefix(t.Doc, "typescript:interface") {
for _, spec := range t.Decl.Specs {
switch spec.(type) {
case *ast.TypeSpec:
typeSpec := spec.(*ast.TypeSpec)
fmt.Printf("Struct: name=%s\n", typeSpec.Name.Name)
switch typeSpec.Type.(type) {
case *ast.StructType:
structType := typeSpec.Type.(*ast.StructType)
for _, field := range structType.Fields.List {
i, ok := field.Type.(*ast.Ident)
if !ok {
// skip fields whose type is not a plain identifier (e.g. slices, pointers)
continue
}
fieldType := i.Name
for _, name := range field.Names {
fmt.Printf("\tField: name=%s type=%s\n", name.Name, fieldType)
}
}
}
}
}
}
}
}
}
return nil
})
if err != nil {
t.Fatal(err)
}
}