Go Anaconda Twitter media upload with tweet

I can tweet, and upload media, but I cannot figure out how to tweet with the media using anaconda ("github.com/ChimeraCoder/anaconda"). The media_id in the example below came from a successful media upload call.
mediaResponse, err := api.UploadMedia("R0lGODlhEAALALMMAOXp8a2503CHtOrt9L3G2+Dl7vL0+J6sy4yew1Jvp/T2+e/y9v///wAAAAAAAAAAACH/C05FVFNDQVBFMi4wAwEAAAAh+QQFCwAMACwAAAAAEAALAAAEK5DJSau91KxlpObepinKIi2kyaAlq7pnCq9p3NZ0aW/47H4dBjAEwhiPlAgAIfkECQsADAAsAAAAAAQACwAABA9QpCQRmhbflPnu4HdJVAQAIfkECQsADAAsAAAAABAACwAABDKQySlSEnOGc4JMCJJk0kEQxxeOpImqIsm4KQPG7VnfbEbDvcnPtpINebJNByiTVS6yCAAh+QQJCwAMACwAAAAAEAALAAAEPpDJSaVISVQWzglSgiAJUBSAdBDEEY5JMQyFyrqMSMq03b67WY2x+uVgvGERp4sJfUyYCQUFJjadj3WzuWQiACH5BAkLAAwALAAAAAAQAAsAAAQ9kMlJq73hnGDWMhJQFIB0EMSxKMoiFcNQmKjKugws0+navrEZ49S7AXfDmg+nExIPnU9oVEqmLpXMBouNAAAh+QQFCwAMACwAAAAAEAALAAAEM5DJSau91KxlpOYSUBTAoiiLZKJSMQzFmjJy+8bnXDMuvO89HIuWs8E+HQYyNAJgntBKBAAh+QQFFAAMACwMAAIABAAHAAAEDNCsJZWaFt+V+ZVUBAA7")
if err != nil {
    fmt.Println(err)
}
//v := url.Values{}
//v.Set("media_ids", string(mediaResponse.MediaID))
fmt.Println(mediaResponse)
tweet := `
"media_ids": 612877656984416256,
"status": "hello"
`
result, err := api.PostTweet(tweet, nil)
if err != nil {
    fmt.Println(err)
} else {
    fmt.Println(result)
}
Can someone assist in telling me how to format the JSON or call PostTweet with the media ID? I've also tried adding the media ID to url.Values without success.

Thanks everyone. I see that the JSON was invalid, but the real issue was an error passing the media_ids parameter. The response was: "errors":[{"code":44,"message":"media_ids parameter is invalid."}], which I thought was complaining about the formatting, but it was actually due to not converting the media_ids value (an int64) to a string correctly. Here is the fixed code:
// fileName and posttitle are assumed to be defined elsewhere in the program.
data, err := ioutil.ReadFile(fileName)
if err != nil {
    fmt.Println(err)
}
// UploadMedia expects the file contents as a base64-encoded string.
mediaResponse, err := api.UploadMedia(base64.StdEncoding.EncodeToString(data))
if err != nil {
    fmt.Println(err)
}
// media_ids must be sent as a string, so format the int64 ID explicitly.
v := url.Values{}
v.Set("media_ids", strconv.FormatInt(mediaResponse.MediaID, 10))
result, err := api.PostTweet(posttitle, v)
if err != nil {
    fmt.Println(err)
} else {
    fmt.Println(result)
}

This is not valid JSON:
tweet := `
"media_ids": 612877656984416256,
"status": "hello"
`
Try using this to generate your JSON:
type Tweet struct {
    MediaIds uint64 `json:"media_ids"`
    Status   string `json:"status"`
}
tweet := Tweet{612877656984416256, "hello"}
b, err := json.Marshal(tweet)
This results in:
{"media_ids":612877656984416256,"status":"hello"}
This has a few benefits over using a raw string.
It is more Go-centric: the struct can be passed around, and its values can be set and read with proper type checking enforced at compile time.
The generated JSON is more likely to be well-formed; for example, Go escapes certain characters to help ensure they are parsed properly by the receiver.
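As a minimal, runnable sketch of that approach (standard library only; the values are the ones from the question):
package main

import (
    "encoding/json"
    "fmt"
)

type Tweet struct {
    MediaIds uint64 `json:"media_ids"`
    Status   string `json:"status"`
}

func main() {
    tweet := Tweet{MediaIds: 612877656984416256, Status: "hello"}
    b, err := json.Marshal(tweet)
    if err != nil {
        fmt.Println(err)
        return
    }
    // Prints: {"media_ids":612877656984416256,"status":"hello"}
    fmt.Println(string(b))
}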

Related

Docker SDK for remote repository

I need to access a private Docker registry using the Go SDK. I found the "registry" package, and I see it has a "DefaultSession" object. I can connect to the private registry, but I can't inspect it through the DefaultSession.
Secondly, the registry package contains the Session struct, which is documented as being for the v1 protocol only. OK, so I connect to the private repository using a Session:
c := http.Client{}
indexInfo, err := registry.ParseSearchIndexInfo("repo")
if err != nil {
    log.Error(err)
    return
}
endpoint, err := registry.NewV1Endpoint(indexInfo, "", nil)
if err != nil {
    log.Error(err)
    return
}
session, err := registry.NewSession(&c, &authConfig, endpoint)
if err != nil {
    log.Error(err)
    return
}
n, err := reference.ParseNamed("docker.io/repo/image")
if err != nil {
    log.Error(err)
    return
}
rep, err := session.GetRepositoryData(n)
if err != nil {
    log.Error(err)
    return
}
But GetRepositoryData returns zero images, even though there are images in the repository. Why?
Is this the right way to access a remote repository? Is there a v2 SDK for Go?
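If the registry speaks the Registry HTTP API v2, one hedged workaround sketch is to skip the SDK and query the tags endpoint directly over HTTP; the registry host, repository name, and credentials below are placeholders:
package main

import (
    "encoding/json"
    "fmt"
    "net/http"
)

// tagList mirrors the response of GET /v2/<name>/tags/list.
type tagList struct {
    Name string   `json:"name"`
    Tags []string `json:"tags"`
}

func main() {
    // Placeholder registry host and repository name.
    req, err := http.NewRequest("GET", "https://registry.example.com/v2/repo/image/tags/list", nil)
    if err != nil {
        fmt.Println(err)
        return
    }
    // Placeholder credentials; some registries require token auth instead of basic auth.
    req.SetBasicAuth("user", "password")
    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        fmt.Println(err)
        return
    }
    defer resp.Body.Close()
    var tags tagList
    if err := json.NewDecoder(resp.Body).Decode(&tags); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(tags.Name, tags.Tags)
}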

Decode JSON from stream of data Docker GO SDK

I want to use the Client.ContainerStats(ctx context.Context, containerID string, stream bool) method to get streaming stats of a container.
From what I understand, if I pass true for the stream parameter, Docker will not close the connection and will periodically send JSON containing the stats of the container.
However, I don't know how to decode the JSON, because I don't know where each JSON document starts and ends.
What I'm doing right now is not using the stream option; I just fetch the data periodically and then decode it like this:
stats, err := dockerClient.ContainerStats(ctx, container.ContainerID, false)
msgBytes, _ := ioutil.ReadAll(stats.Body)
var containerStats ContainerStats
err = json.Unmarshal(msgBytes, &containerStats)
What I'm looking for is a function that blocks when I call it; when it receives JSON data (I mean a complete JSON document that can be decoded), it returns a struct containing the data decoded from the JSON, and then I can call that function again to get the next stats without having to make a new request to Docker.
In your case, you have two options:
Map the result onto a custom struct
Map the result onto a map[string]interface{}
If you want to map onto a custom struct, you can do something like this:
type myStruct struct {
    Id      string `json:"id"`
    Read    string `json:"read"`
    Preread string `json:"preread"`
}

// Perform actions to retrieve the stats
// ...
var containerStats myStruct
json.NewDecoder(stats.Body).Decode(&containerStats)
fmt.Println(containerStats.Id)
With this solution, you have to decide which fields you want to map.
However, if you do not want to specify the fields, you can do something like this:
//Perform actions to retrieve logs in stats
//...
var containerStats map[string]interface{}
json.NewDecoder(stats.Body).Decode(&containerStats)
fmt.Println(containerStats["id"])
To conclude, if you have to manipulate your data, I recommend the first solution, using a custom struct.
EDITED: handle stream
If you pass true for the stream parameter, the Docker API returns an io.ReadCloser that is updated as new stats arrive; it is up to the caller to close the returned io.ReadCloser.
What you have to do is periodically read from it and decode each JSON document as it arrives.
type myStruct struct {
    Id       string `json:"id"`
    Read     string `json:"read"`
    Preread  string `json:"preread"`
    CpuStats cpu    `json:"cpu_stats"`
}

type cpu struct {
    Usage cpuUsage `json:"cpu_usage"`
}

type cpuUsage struct {
    Total float64 `json:"total_usage"`
}

func main() {
    ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    cli, e := client.NewEnvClient()
    if e != nil {
        panic(e)
    }
    stats, e := cli.ContainerStats(ctx, "container_id", true)
    if e != nil {
        fmt.Println(e)
    }
    decoder := json.NewDecoder(stats.Body)
    var containerStats myStruct
    for {
        select {
        case <-ctx.Done():
            stats.Body.Close()
            fmt.Println("Stop logging")
            return
        default:
            if err := decoder.Decode(&containerStats); err == io.EOF {
                return
            } else if err != nil {
                cancel()
            }
            fmt.Println(containerStats.CpuStats.Usage.Total)
        }
    }
}
In this example, we are decoding the stats.Body ReadCloser when new data arrives, printing the total cpu usage, and closing the stream after 5 seconds.
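To get the blocking call the question asks for, a minimal sketch (assuming the same myStruct as above) is to wrap the json.Decoder in a small type whose Next method blocks until the next complete stats document has been decoded:
type statsStream struct {
    dec  *json.Decoder
    body io.ReadCloser
}

func newStatsStream(body io.ReadCloser) *statsStream {
    return &statsStream{dec: json.NewDecoder(body), body: body}
}

// Next blocks until a complete JSON document has arrived on the stream,
// then returns the decoded stats; it returns io.EOF when the stream ends.
func (s *statsStream) Next() (myStruct, error) {
    var m myStruct
    err := s.dec.Decode(&m)
    return m, err
}

func (s *statsStream) Close() error {
    return s.body.Close()
}
Calling Next in a loop then yields one decoded stats object per call, without making a new request to Docker each time.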

golang parse POST request

I have an HTTP POST request with the payload
indices=0%2C1%2C2
Here is my Go backend code:
err := r.ParseForm()
if err != nil {
    http.Error(w, err.Error(), http.StatusInternalServerError)
    return
}
log.Println("r.PostForm", r.PostForm)
log.Println("r.Form", r.Form)
body, err := ioutil.ReadAll(r.Body)
if err != nil {
    http.Error(w, err.Error(), http.StatusInternalServerError)
    return
}
log.Println("r.Body", string(body))
values, err := url.ParseQuery(string(body))
if err != nil {
    http.Error(w, err.Error(), http.StatusInternalServerError)
    return
}
log.Println("indices from body", values.Get("indices"))
Output:
r.PostForm map[]
r.Form map[]
r.Body indices=0%2C1%2C2
indices from body 0,1,2
Why is it that the POST request is not parsed by r.ParseForm(), while manually parsing it with url.ParseQuery(string(body)) gives the correct result?
The problem is not in your server code, which is fine, but simply that your client, whatever it is, is missing the correct Content-Type header for POST forms. Set the header to
Content-Type: application/x-www-form-urlencoded
in your client.
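For example, a minimal client-side sketch (the URL is a placeholder): http.PostForm sets that Content-Type header for you.
package main

import (
    "log"
    "net/http"
    "net/url"
)

func main() {
    // http.PostForm sends the body as application/x-www-form-urlencoded.
    resp, err := http.PostForm("http://localhost:8080/endpoint",
        url.Values{"indices": {"0,1,2"}})
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()
    log.Println(resp.Status)
}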
To get the value of your parameters, use PostFormValue("params") on your http.Request:
err := r.ParseForm()
if err != nil {
    panic(err)
}
params := r.PostFormValue("params") // get the value for the key "params"

Using jwt-go Library - Key is invalid or invalid type

I am trying to pass a token to the Parse(tokenString string, keyFunc Keyfunc) function defined in this Go library (http://godoc.org/github.com/dgrijalva/jwt-go) for JWT token parsing/validation.
When I pass the token to this function -
token, err := jwt.Parse(getToken, func(token *jwt.Token) (interface{}, error) {
    return config.Config.Key, nil
})
I get an error which says "Key is invalid or invalid type".
My config struct looks like this in the config.go file:
config struct {
    Key string
}
Any suggestions to solve this problem? The token I am passing is a JWT token.
config struct {
    Key string
}
Key needs to be a []byte.
I am not sure if this can be an issue for someone else.
My problem was that I was using the signing method SigningMethodES256; with SigningMethodHS256 (or any of the SigningMethodHS* methods) it works fine.
If someone knows why this is an issue, please answer.
Another way is to do something like this:
token, err := jwt.Parse(getToken, func(token *jwt.Token) (interface{}, error) {
    return []byte(config.Config.Key), nil
})
The whole idea being that the key function passed to Parse must return the key as a slice of bytes.
Taking a look at the function signatures in the GoDoc for github.com/dgrijalva/jwt-go we see:
func Parse(tokenString string, keyFunc Keyfunc) (*Token, error)
type Keyfunc func(*Token) (interface{}, error)
Keyfunc requires you to return (interface{}, error). Given the mysterious interface{} type, you might expect to be fine returning a string; however, a peek under the hood reveals that Parse() tries to Verify(), which attempts the following type assertion with your interface{} value as the key:
keyBytes, ok := key.([]byte)
That will succeed for []byte types, but will fail for string types. When that fails, the result is the error message you are getting. Read more about type assertions in the Effective Go documentation to learn why it fails.
Example: https://play.golang.org/p/9KKNFLLQrm
package main

import "fmt"

func main() {
    var a interface{}
    var b interface{}
    a = []byte("hello")
    b = "hello"
    key, ok := a.([]byte)
    if !ok {
        fmt.Println("a is an invalid type")
    } else {
        fmt.Println(key)
    }
    key, ok = b.([]byte)
    if !ok {
        fmt.Println("b is an invalid type")
    } else {
        fmt.Println(key)
    }
}
[104 101 108 108 111]
b is an invalid type
This is working for me.
token.SignedString([]byte("mysecretkey"))
func GenerateJWT(email string, username string) (tokenString string, err error) {
    expirationTime := time.Now().Add(1 * time.Hour)
    claims := &JWTClime{
        Email:    email,
        Username: username,
        StandardClaims: jwt.StandardClaims{
            ExpiresAt: expirationTime.Unix(),
        },
    }
    token := jwt.NewWithClaims(jwt.SigningMethodHS256, claims)
    tokenString, err = token.SignedString([]byte("mysecretkey"))
    return
}
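For completeness, a hedged sketch of the matching validation side, assuming the same JWTClime claims struct and secret as above (plus the errors and jwt-go imports); the key is again returned as []byte:
func ValidateJWT(tokenString string) (*JWTClime, error) {
    claims := &JWTClime{}
    token, err := jwt.ParseWithClaims(tokenString, claims, func(t *jwt.Token) (interface{}, error) {
        // The key must be returned as []byte, matching the signing key.
        return []byte("mysecretkey"), nil
    })
    if err != nil {
        return nil, err
    }
    if !token.Valid {
        return nil, errors.New("invalid token")
    }
    return claims, nil
}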

Error when fetching URL through proxy in Go

This is related to this other question. I'm fetching a URL through a proxy using this simple code:
package main

import (
    "fmt"
    "io/ioutil"
    "net/http"
    "net/url"
)

func main() {
    proxyUrl, err := url.Parse("87.236.233.92:8080")
    httpClient := &http.Client{Transport: &http.Transport{Proxy: http.ProxyURL(proxyUrl)}}
    response, err := httpClient.Get("http://stackoverflow.com")
    if err != nil {
        fmt.Println(err.Error())
    } else {
        body, _ := ioutil.ReadAll(response.Body)
        fmt.Println("OK: ", len(body))
    }
}
If I run this code, I am getting this error:
Get http://stackoverflow.com: http: error connecting to proxy 87.236.233.92:8080: GetServByName: The requested name is valid, but no data of the requested type was found.
I know that the proxy address is valid, and if I fetch the URL through the proxy by other means it works. Any idea why I'm getting this error?
Specify your proxy with http:// in front and it should work, e.g.
proxyUrl, err := url.Parse("http://87.236.233.92:8080")
if err != nil {
    fmt.Println("Bad proxy URL", err)
    return
}
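As an alternative sketch, the transport can also pick the proxy up from the HTTP_PROXY / HTTPS_PROXY environment variables instead of hard-coding the URL:
package main

import (
    "fmt"
    "io/ioutil"
    "net/http"
)

func main() {
    // http.ProxyFromEnvironment reads HTTP_PROXY, HTTPS_PROXY and NO_PROXY.
    httpClient := &http.Client{
        Transport: &http.Transport{Proxy: http.ProxyFromEnvironment},
    }
    response, err := httpClient.Get("http://stackoverflow.com")
    if err != nil {
        fmt.Println(err.Error())
        return
    }
    defer response.Body.Close()
    body, _ := ioutil.ReadAll(response.Body)
    fmt.Println("OK: ", len(body))
}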
