I have an app that uploads via the Twitter API chunked upload, and it finally works with photos.
However, I am now trying to get it to work with videos.
Uploading didn't work out of the box even though the video format is mp4. The Twitter guidelines for uploads are these:
With that in mind, I have this ffmpeg command so far:
ffmpeg -i in.mp4 -vf "scale=1280:720" -b:v 5000K -minrate 5000K -maxrate 5000K -b:a 128K -r 30 -f mp4 -vcodec libx264 -profile:v high -pix_fmt yuv420p -strict -2 -ac 2 -acodec aac out.mp4
I still get "Not valid video", and I don't know why.
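For reference, a quick ffprobe view of the relevant video-stream fields (a condensed version of the full mediainfo dump below):

ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,level,width,height,pix_fmt,avg_frame_rate -of default=noprint_wrappers=1 out.mp4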
Here's my mediainfo output for out.mp4:
General
Count : 328
Count of stream of this kind : 1
Kind of stream : General
Kind of stream : General
Stream identifier : 0
Count of video streams : 1
Count of audio streams : 1
Video_Format_List : AVC
Video_Format_WithHint_List : AVC
Codecs Video : AVC
Video_Language_List : English
Audio_Format_List : AAC
Audio_Format_WithHint_List : AAC
Audio codecs : AAC LC
Audio_Language_List : English
Complete name : video-5e4405cd4348a5e4405cd434d2.mp4
File name : video-5e4405cd4348a5e4405cd434d2
File extension : mp4
Format : MPEG-4
Format : MPEG-4
Format/Extensions usually used : mov mp4 m4v m4a m4b m4p 3ga 3gpa 3gpp 3gp 3gpp2 3g2 k3g jpm jpx mqv ismv isma ismt f4a f4b f4v
Commercial name : MPEG-4
Format profile : Base Media
Internet media type : video/mp4
Codec ID : isom
Codec ID : isom (isom/iso2/avc1/mp41)
Codec ID/Url : http://www.apple.com/quicktime/download/standalone.html
CodecID_Compatible : isom/iso2/avc1/mp41
Codec : MPEG-4
Codec : MPEG-4
Codec/Extensions usually used : mov mp4 m4v m4a m4b m4p 3ga 3gpa 3gpp 3gp 3gpp2 3g2 k3g jpm jpx mqv ismv isma ismt f4a f4b f4v
File size : 52664272
File size : 50.2 MiB
File size : 50 MiB
File size : 50 MiB
File size : 50.2 MiB
File size : 50.22 MiB
Duration : 79744
Duration : 1 min 19 s
Duration : 1 min 19 s 744 ms
Duration : 1 min 19 s
Duration : 00:01:19.744
Duration : 00:01:19:20
Duration : 00:01:19.744 (00:01:19:20)
Overall bit rate : 5283334
Overall bit rate : 5 283 kb/s
Frame rate : 30.000
Frame rate : 30.000 FPS
Frame count : 2390
Stream size : 88780
Stream size : 86.7 KiB (0%)
Stream size : 87 KiB
Stream size : 87 KiB
Stream size : 86.7 KiB
Stream size : 86.70 KiB
Stream size : 86.7 KiB (0%)
Proportion of this stream : 0.00169
HeaderSize : 40
DataSize : 52575500
FooterSize : 88732
IsStreamable : No
File last modification date : UTC 2020-02-12 14:05:37
File last modification date (local) : 2020-02-12 15:05:37
Writing application : Lavf57.83.100
Writing application : Lavf57.83.100
Video
Count : 342
Count of stream of this kind : 1
Kind of stream : Video
Kind of stream : Video
Stream identifier : 0
StreamOrder : 0
ID : 1
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format/Url : http://developers.videolan.org/x264.html
Commercial name : AVC
Format profile : High@L3.1
Format settings : CABAC / 4 Ref Frames
Format settings, CABAC : Yes
Format settings, CABAC : Yes
Format settings, ReFrames : 4
Format settings, ReFrames : 4 frames
Internet media type : video/H264
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Codec : AVC
Codec : AVC
Codec/Family : AVC
Codec/Info : Advanced Video Codec
Codec/Url : http://developers.videolan.org/x264.html
Codec/CC : avc1
Codec profile : High@L3.1
Codec settings : CABAC / 4 Ref Frames
Codec settings, CABAC : Yes
Codec_Settings_RefFrames : 4
Duration : 79667
Duration : 1 min 19 s
Duration : 1 min 19 s 667 ms
Duration : 1 min 19 s
Duration : 00:01:19.667
Duration : 00:01:19:20
Duration : 00:01:19.667 (00:01:19:20)
Bit rate : 5000000
Bit rate : 5 000 kb/s
Width : 1280
Width : 1 280 pixels
Height : 720
Height : 720 pixels
Sampled_Width : 1280
Sampled_Height : 720
Pixel aspect ratio : 1.000
Display aspect ratio : 1.778
Display aspect ratio : 16:9
Rotation : 0.000
Frame rate mode : CFR
Frame rate mode : Constant
FrameRate_Mode_Original : VFR
Frame rate : 30.000
Frame rate : 30.000 FPS
Frame count : 2390
Resolution : 8
Resolution : 8 bits
Colorimetry : 4:2:0
Color space : YUV
Chroma subsampling : 4:2:0
Chroma subsampling : 4:2:0
Bit depth : 8
Bit depth : 8 bits
Scan type : Progressive
Scan type : Progressive
Interlacement : PPF
Interlacement : Progressive
Bits/(Pixel*Frame) : 0.181
Stream size : 51297022
Stream size : 48.9 MiB (97%)
Stream size : 49 MiB
Stream size : 49 MiB
Stream size : 48.9 MiB
Stream size : 48.92 MiB
Stream size : 48.9 MiB (97%)
Proportion of this stream : 0.97404
Writing library : x264 - core 152 r2854 e9a5903
Writing library : x264 core 152 r2854 e9a5903
Encoded_Library_Name : x264
Encoded_Library_Version : core 152 r2854 e9a5903
Encoding settings : cabac=1 / ref=3 / deblock=1:0:0 / analyse=0x3:0x113 / me=hex / subme=7 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=1 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=12 / lookahead_threads=2 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=3 / b_pyramid=2 / b_adapt=1 / b_bias=0 / direct=1 / weightb=1 / open_gop=0 / weightp=2 / keyint=250 / keyint_min=25 / scenecut=40 / intra_refresh=0 / rc_lookahead=40 / rc=abr / mbtree=1 / bitrate=5000 / ratetol=1.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ip_ratio=1.40 / aq=1:1.00
Language : en
Language : English
Language : English
Language : en
Language : eng
Language : en
Audio
Count : 275
Count of stream of this kind : 1
Kind of stream : Audio
Kind of stream : Audio
Stream identifier : 0
StreamOrder : 1
ID : 2
ID : 2
Format : AAC
Format/Info : Advanced Audio Codec
Commercial name : AAC
Format profile : LC
Format settings, SBR : No (Explicit)
Format settings, SBR : No (Explicit)
Codec ID : mp4a-40-2
Codec : AAC LC
Codec : AAC LC
Codec/Family : AAC
Codec/CC : 40
Duration : 79744
Duration : 1 min 19 s
Duration : 1 min 19 s 744 ms
Duration : 1 min 19 s
Duration : 00:01:19.744
Duration : 00:01:19:25
Duration : 00:01:19.744 (00:01:19:25)
Bit rate mode : CBR
Bit rate mode : Constant
Bit rate : 128257
Bit rate : 128 kb/s
Channel(s) : 2
Channel(s) : 2 channels
Channel positions : Front: L R
Channel positions : 2/0/0
ChannelLayout : L R
Samples per frame : 1024
Sampling rate : 48000
Sampling rate : 48.0 kHz
Samples count : 3827712
Frame rate : 46.875
Frame rate : 46.875 FPS (1024 SPF)
Frame count : 3738
Compression mode : Lossy
Compression mode : Lossy
Stream size : 1278470
Stream size : 1.22 MiB (2%)
Stream size : 1 MiB
Stream size : 1.2 MiB
Stream size : 1.22 MiB
Stream size : 1.219 MiB
Stream size : 1.22 MiB (2%)
Proportion of this stream : 0.02428
Language : en
Language : English
Language : English
Language : en
Language : eng
Language : en
Default : Yes
Default : Yes
Alternate group : 1
Alternate group : 1
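One field in the dump above that stands out: IsStreamable : No, meaning the moov atom sits at the end of the file. Some upload pipelines reject MP4s laid out like that, so if Twitter's validator is among them, re-encoding with ffmpeg's -movflags +faststart would move the atom to the front; whether this is the actual cause here is an open question, but the variant would be:

ffmpeg -i in.mp4 -vf "scale=1280:720" -b:v 5000K -minrate 5000K -maxrate 5000K -b:a 128K -r 30 -f mp4 -vcodec libx264 -profile:v high -pix_fmt yuv420p -strict -2 -ac 2 -acodec aac -movflags +faststart out.mp4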
Edit: I guess I'll add my PHP code too (trimmed down in a sensible manner, because the actual file is very large):
// Set up Codebird
\Codebird\Codebird::setConsumerKey($consumer_key, $consumer_secret); // static, see README
$cb = \Codebird\Codebird::getInstance();
$cb->setToken($token, $token_secret);
$cb->setTimeout(60 * 1000); // 60 second request timeout

// Convert the video to the format required by Twitter.
$video = new Video($path);
$converted = $video->convert();
$path = realpath('videos/' . $converted);

$file = fopen($path, 'rb');
$size = fstat($file)['size'];
$mime_type = mime_content_type($path);

// INIT the upload.
$media = $cb->media_upload([
    'command' => 'INIT',
    'media_type' => $mime_type,
    'media_category' => 'tweet_video',
    'total_bytes' => $size,
]);
$success = $media->httpstatus >= 200 && $media->httpstatus < 300; # 2xx
if (!$success) {
    throw new TwitterException("Failed to INIT upload for $path...");
}

// APPEND chunks to the upload.
$mediaId = $media->media_id_string;
$segmentId = 0;
while (!feof($file)) {
    echo "chunk #$segmentId....";
    $chunk = fread($file, 512 * 1024); // read up to 512 KB per chunk
    echo "chunk size: " . strlen($chunk);
    $media = $cb->media_upload([
        'command' => 'APPEND',
        'media_id' => $mediaId,
        'segment_index' => $segmentId,
        'media' => $chunk,
    ]);
    $success = $media->httpstatus >= 200 && $media->httpstatus < 300; # 2xx
    if (!$success) {
        throw new TwitterException("Failed to APPEND to upload for $path, chunk $segmentId...");
    }
    $segmentId++;
}

// Close the file and FINALIZE the upload.
fclose($file);
echo "FINALIZING id $mediaId...";
$media = $cb->media_upload([
    'command' => 'FINALIZE',
    'media_id' => $mediaId,
]);
$success = $media->httpstatus >= 200 && $media->httpstatus < 300; # 2xx
if (!$success) {
    var_dump($media);
    throw new TwitterException("Failed to FINALIZE upload for $path...");
}
return $mediaId;
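One note on the FINALIZE step: for tweet_video uploads the API processes the video asynchronously, so the FINALIZE response can carry a processing_info object that has to be polled via the STATUS command before the media_id is usable in a tweet. A minimal sketch of that polling, which would go before the return above (it assumes Codebird dispatches STATUS as the GET request the endpoint expects; check your Codebird version):

$info = isset($media->processing_info) ? $media->processing_info : null;
while ($info !== null && in_array($info->state, ['pending', 'in_progress'])) {
    // The API suggests how long to wait before the next poll.
    sleep(isset($info->check_after_secs) ? $info->check_after_secs : 5);
    $media = $cb->media_upload([
        'command' => 'STATUS',
        'media_id' => $mediaId,
    ]);
    $info = isset($media->processing_info) ? $media->processing_info : null;
}
if ($info !== null && $info->state === 'failed') {
    throw new TwitterException("Processing failed for $path...");
}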
Video is my ffmpeg class, which I'll paste below, and $path is a URL leading to a perfectly valid mp4 video.
Video.php:
<?php

class VideoConversionException extends \Exception {}

class Video {
    public $name;
    public $converted;

    public function __construct($name) {
        self::clear();
        $this->name = $name;
    }

    public function convert() {
        $tmpVideo = 'video-' . uniqid() . uniqid() . '.mp4';
        $videoPath = 'videos/' . $tmpVideo;
        //$ffmpeg = "ffmpeg -i https://tvcanarias.acfipress.com/BC_190907_gc_teror.mp4 -vf \"scale=1280:720\" -b:v 5000K -b:a 128K -r 30 -f mp4 -vcodec libx264 -acodec aac output_video.mp4";
        // NOTE: $this->name is interpolated into the shell command unescaped;
        // escapeshellarg() would be safer here.
        $ffmpeg = "ffmpeg -i {$this->name} -vf \"scale=1280:720\" -b:v 5000K -minrate 5000K -maxrate 5000K -b:a 128K -r 30 -f mp4 -vcodec libx264 -profile:v high -pix_fmt yuv420p -strict -2 -ac 2 -acodec aac $videoPath";
        //$ffmpeg = "ffmpeg -i {$this->name} -pix_fmt yuv420p -vcodec libx264 -vf scale=640:-1 -acodec aac -vb 1024k -minrate 1024k -maxrate 1024k -bufsize 1024k -ar 44100 -ac 2 -strict experimental -r 30 $videoPath";
        $output = [];
        exec($ffmpeg, $output, $status);
        if ($status != 0) {
            //die("Couldn't run ffmpeg. (Error code: #$status)");
            throw new VideoConversionException("Couldn't run ffmpeg. (Error code: #$status)");
        }
        $this->converted = $tmpVideo;
        return $tmpVideo;
    }

    public function shredConverted() {
        // Delete the converted video.
        #unlink("videos/{$this->converted}");
    }

    public static function clear() {
        // We can't really shred videos right away, as they might still be uploading.
        // Therefore, every time this library is used, we just delete videos older
        // than, say, an hour.
        $files = scandir('videos');
        $curtime = time();
        foreach ($files as $file) {
            if ($file == '.gitignore' || $file == '.' || $file == '..') {
                continue;
            }
            $mtime = filemtime("videos/$file");
            $diff = $curtime - $mtime;
            $overAnHour = $diff > (60 * 60);
            if ($overAnHour) {
                #unlink("videos/$file");
            }
        }
    }
}
I have a simple Grails app, which I am trying to set up to use ElasticSearch.
I have a single-node ElasticSearch instance running on EC2, which is running happily enough. (For reference, I just followed the steps here: http://www.elasticsearch.org/tutorials/elasticsearch-on-ec2/, but using 0.90.7 and the cloud-aws plugin version 1.15.0.)
I am using the Grails ElasticSearch GORM plugin (http://grails.org/plugin/elasticsearch-gorm) (master branch), and I'm connecting to ES using the transport client mode (elasticSearch.client.mode = 'transport').
Here's where it gets really odd...
The first time I boot up my app, it happily indexes my domain data on ES; I can query it, etc., no problems.
If I then restart my Grails app, it won't launch at all. I get:
Message: Error creating bean with name 'searchableClassMappingConfigurator': Invocation of init method failed; nested exception is org.elasticsearch.transport.TransportSerializationException: Failed to deserialize exception response from stream
Line | Method
->> 262 | run in java.util.concurrent.FutureTask
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 1145 | runWorker in java.util.concurrent.ThreadPoolExecutor
| 615 | run . . . in java.util.concurrent.ThreadPoolExecutor$Worker
^ 724 | run in java.lang.Thread
Caused by TransportSerializationException: Failed to deserialize exception response from stream
->> 169 | handlerResponseError in org.elasticsearch.transport.netty.MessageChannelHandler
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 123 | messageReceived in ''
| 70 | handleUpstream in org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler
| 564 | sendUpstream in org.elasticsearch.common.netty.channel.DefaultChannelPipeline
| 791 | sendUpstream in org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext
| 296 | fireMessageReceived in org.elasticsearch.common.netty.channel.Channels
| 462 | unfoldAndFireMessageReceived in org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder
| 443 | callDecode in ''
| 310 | messageReceived in ''
| 70 | handleUpstream in org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler
| 564 | sendUpstream in org.elasticsearch.common.netty.channel.DefaultChannelPipeline
| 559 | sendUpstream in ''
| 268 | fireMessageReceived in org.elasticsearch.common.netty.channel.Channels
| 255 | fireMessageReceived in ''
| 88 | read . . in org.elasticsearch.common.netty.channel.socket.nio.NioWorker
| 108 | process in org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker
| 318 | run . . . in org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector
| 89 | run in org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker
| 178 | run . . . in org.elasticsearch.common.netty.channel.socket.nio.NioWorker
| 108 | run in org.elasticsearch.common.netty.util.ThreadRenamingRunnable
| 42 | run . . . in org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1
| 1145 | runWorker in java.util.concurrent.ThreadPoolExecutor
| 615 | run . . . in java.util.concurrent.ThreadPoolExecutor$Worker
^ 724 | run in java.lang.Thread
Caused by StreamCorruptedException: unexpected end of block data
->> 1370 | readObject0 in java.io.ObjectInputStream
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 1989 | defaultReadFields in ''
| 499 | defaultReadObject in ''
| 914 | readObject in java.lang.Throwable
| 1017 | invokeReadObject in java.io.ObjectStreamClass
| 1891 | readSerialData in java.io.ObjectInputStream
| 1796 | readOrdinaryObject in ''
| 1348 | readObject0 in ''
| 1989 | defaultReadFields in ''
| 499 | defaultReadObject in ''
| 914 | readObject in java.lang.Throwable
| 1017 | invokeReadObject in java.io.ObjectStreamClass
| 1891 | readSerialData in java.io.ObjectInputStream
| 1796 | readOrdinaryObject in ''
| 1348 | readObject0 in ''
| 370 | readObject in ''
| 167 | handlerResponseError in org.elasticsearch.transport.netty.MessageChannelHandler
| 123 | messageReceived in ''
| 70 | handleUpstream in org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler
| 564 | sendUpstream in org.elasticsearch.common.netty.channel.DefaultChannelPipeline
| 791 | sendUpstream in org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext
| 296 | fireMessageReceived in org.elasticsearch.common.netty.channel.Channels
| 462 | unfoldAndFireMessageReceived in org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder
| 443 | callDecode in ''
| 310 | messageReceived in ''
| 70 | handleUpstream in org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler
| 564 | sendUpstream in org.elasticsearch.common.netty.channel.DefaultChannelPipeline
| 559 | sendUpstream in ''
| 268 | fireMessageReceived in org.elasticsearch.common.netty.channel.Channels
| 255 | fireMessageReceived in ''
| 88 | read . . in org.elasticsearch.common.netty.channel.socket.nio.NioWorker
| 108 | process in org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker
| 318 | run . . . in org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector
| 89 | run in org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker
| 178 | run . . . in org.elasticsearch.common.netty.channel.socket.nio.NioWorker
| 108 | run in org.elasticsearch.common.netty.util.ThreadRenamingRunnable
| 42 | run . . . in org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1
| 1145 | runWorker in java.util.concurrent.ThreadPoolExecutor
| 615 | run . . . in java.util.concurrent.ThreadPoolExecutor$Worker
^ 724 | run in java.lang.Thread
This happens until I change the ElasticSearch host details; i.e., I can't boot my app at all with the original host details, ever again.
Both the ES node and my Grails app are using ElasticSearch 0.90.7. My config for the ES plugin looks like so:
elasticSearch.client.mode = 'transport'
elasticSearch.client.hosts = [[host:'<my EC2 DNS>', port:9300]]
elasticSearch.datastoreImpl = 'mongoDatastore'
elasticSearch.client.transport.sniff = true
The only domain object I am marking as 'searchable' is mapped with mongoDB, which looks like so:
class CompletedApplicationFormSearchEntry {
    static searchable = true

    Long formId
    Long jobId
    Long employerId
    Long jobseekerId
    Date applicationDate

    static mapWith = "mongo"

    static constraints = {
    }
}
If I remove the searchable attribute from the domain class and then relaunch the app, it launches fine, so I assume that there's something going on in the bootstrapping process when the domain object is detected as searchable; but of course, it only causes an issue once the app has been restarted.
There are a handful of threads kicking about where people are seeing similar issues, where they have nodes running different ES versions, different JVM versions, etc. But in this case, I only have one node!
I am absolutely tearing my hair out over this; I just can't work out what on earth is going wrong. I've tried different plugin versions, ElasticSearch versions, a 32-bit EC2 instance, a 64-bit EC2 instance - no luck!
This wasn't with the Grails ES plugin, but I fixed a similar issue by making the JVM version used in the IDE (i.e. by the ES client) the same as the one the ES master was running on, since they were on different JVM versions.
STEP 0: Make sure the API (client) version and the actual ES version are identical.
STEP 1: Check the JVM versions used by the ES nodes.
In the following case, I had two different JVM versions, as shown by the "jvm" JSON key of both ES nodes.
$ curl -XGET "http://localhost:9200/_nodes?jvm=true&pretty=true"
{
"cluster_name" : "elasticsearch",
"nodes" : {
"A6PDUvlWSN-zN2GKRxrSHA" : {
"name" : "Madeline Joyce",
"transport_address" : "inet[/192.168.1.4:9301]",
"host" : "prayagupd",
"ip" : "127.0.1.1",
"version" : "1.3.2",
"build" : "dee175d",
"http_address" : "inet[/192.168.1.4:9201]",
"settings" : {
"path" : {
"logs" : "/usr/local/elasticsearch-1.3.2/logs",
"home" : "/usr/local/elasticsearch-1.3.2"
},
"cluster" : {
"name" : "elasticsearch"
},
"http" : {
"port" : "9201"
},
"transport" : {
"tcp" : {
"port" : "9301"
}
},
"foreground" : "yes",
"name" : "Madeline Joyce"
},
"os" : {
"refresh_interval_in_millis" : 1000,
"available_processors" : 4,
"cpu" : {
"vendor" : "Intel",
"model" : "Core(TM) i5 CPU M 480 @ 2.67GHz",
"mhz" : 2667,
"total_cores" : 4,
"total_sockets" : 4,
"cores_per_socket" : 16,
"cache_size_in_bytes" : 3072
},
"mem" : {
"total_in_bytes" : 3803283456
},
"swap" : {
"total_in_bytes" : 5998899200
}
},
"process" : {
"refresh_interval_in_millis" : 1000,
"id" : 9036,
"max_file_descriptors" : 4096,
"mlockall" : false
},
"jvm" : {
"pid" : 9036,
"version" : "1.7.0_65",
"vm_name" : "Java HotSpot(TM) 64-Bit Server VM",
"vm_version" : "24.65-b04",
"vm_vendor" : "Oracle Corporation",
"start_time_in_millis" : 1421578674811,
"mem" : {
"heap_init_in_bytes" : 268435456,
"heap_max_in_bytes" : 1038876672,
"non_heap_init_in_bytes" : 24313856,
"non_heap_max_in_bytes" : 136314880,
"direct_max_in_bytes" : 1038876672
},
"gc_collectors" : [ "ParNew", "ConcurrentMarkSweep" ],
"memory_pools" : [ "Code Cache", "Par Eden Space", "Par Survivor Space", "CMS Old Gen", "CMS Perm Gen" ]
},
"thread_pool" : {
"generic" : {
"type" : "cached",
"keep_alive" : "30s",
"queue_size" : -1
},
"index" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "200"
},
"snapshot_data" : {
"type" : "scaling",
"min" : 1,
"max" : 5,
"keep_alive" : "5m",
"queue_size" : -1
},
"bench" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"get" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "1k"
},
"snapshot" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"merge" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"suggest" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "1k"
},
"bulk" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "50"
},
"optimize" : {
"type" : "fixed",
"min" : 1,
"max" : 1,
"queue_size" : -1
},
"warmer" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"flush" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"search" : {
"type" : "fixed",
"min" : 12,
"max" : 12,
"queue_size" : "1k"
},
"percolate" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "1k"
},
"management" : {
"type" : "scaling",
"min" : 1,
"max" : 5,
"keep_alive" : "5m",
"queue_size" : -1
},
"refresh" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
}
},
"network" : {
"refresh_interval_in_millis" : 5000,
"primary_interface" : {
"address" : "192.168.1.4",
"name" : "eth0",
"mac_address" : "20:6A:8A:2A:24:E6"
}
},
"transport" : {
"bound_address" : "inet[/0:0:0:0:0:0:0:0%0:9301]",
"publish_address" : "inet[/192.168.1.4:9301]"
},
"http" : {
"bound_address" : "inet[/0:0:0:0:0:0:0:0%0:9201]",
"publish_address" : "inet[/192.168.1.4:9201]",
"max_content_length_in_bytes" : 104857600
},
"plugins" : [ ]
},
"TWNkkYYZSWe8NnrOPU57mQ" : {
"name" : "Scarlet Spider",
"transport_address" : "inet[/192.168.1.4:9300]",
"host" : "prayagupd",
"ip" : "127.0.1.1",
"version" : "1.3.2",
"build" : "dee175d",
"http_address" : "inet[/192.168.1.4:9200]",
"attributes" : {
"client" : "true",
"data" : "false"
},
"settings" : {
"path" : {
"data" : "/var/lib/elasticsearch",
"work" : "/tmp/elasticsearch",
"conf" : "/etc/elasticsearch",
"logs" : "/var/log/elasticsearch"
},
"cluster" : {
"name" : "elasticsearch"
},
"node" : {
"client" : "true"
},
"name" : "Scarlet Spider"
},
"os" : {
"refresh_interval_in_millis" : 1000,
"available_processors" : 4
},
"process" : {
"refresh_interval_in_millis" : 1000,
"id" : 11028,
"max_file_descriptors" : 4096,
"mlockall" : false
},
"jvm" : {
"pid" : 11028,
"version" : "1.7.0_05",
"vm_name" : "Java HotSpot(TM) 64-Bit Server VM",
"vm_version" : "23.1-b03",
"vm_vendor" : "Oracle Corporation",
"start_time_in_millis" : 1421580829189,
"mem" : {
"heap_init_in_bytes" : 59426304,
"heap_max_in_bytes" : 846331904,
"non_heap_init_in_bytes" : 24313856,
"non_heap_max_in_bytes" : 136314880,
"direct_max_in_bytes" : 846331904
},
"gc_collectors" : [ "PS Scavenge", "PS MarkSweep" ],
"memory_pools" : [ "Code Cache", "PS Eden Space", "PS Survivor Space", "PS Old Gen", "PS Perm Gen" ]
},
"thread_pool" : {
"generic" : {
"type" : "cached",
"keep_alive" : "30s",
"queue_size" : -1
},
"index" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "200"
},
"snapshot_data" : {
"type" : "scaling",
"min" : 1,
"max" : 5,
"keep_alive" : "5m",
"queue_size" : -1
},
"bench" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"get" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "1k"
},
"snapshot" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"merge" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"suggest" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "1k"
},
"bulk" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "50"
},
"optimize" : {
"type" : "fixed",
"min" : 1,
"max" : 1,
"queue_size" : -1
},
"warmer" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"flush" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
},
"search" : {
"type" : "fixed",
"min" : 12,
"max" : 12,
"queue_size" : "1k"
},
"percolate" : {
"type" : "fixed",
"min" : 4,
"max" : 4,
"queue_size" : "1k"
},
"management" : {
"type" : "scaling",
"min" : 1,
"max" : 5,
"keep_alive" : "5m",
"queue_size" : -1
},
"refresh" : {
"type" : "scaling",
"min" : 1,
"max" : 2,
"keep_alive" : "5m",
"queue_size" : -1
}
},
"network" : {
"refresh_interval_in_millis" : 5000
},
"transport" : {
"bound_address" : "inet[/0:0:0:0:0:0:0:0:9300]",
"publish_address" : "inet[/192.168.1.4:9300]"
},
"http" : {
"bound_address" : "inet[/0:0:0:0:0:0:0:0:9200]",
"publish_address" : "inet[/192.168.1.4:9200]",
"max_content_length_in_bytes" : 104857600
},
"plugins" : [ ]
}
}
}
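To compare just the JVM versions without reading through the whole dump, something like this works (assuming jq is installed; the filter is just an illustration):

$ curl -s "http://localhost:9200/_nodes?jvm=true" | jq '.nodes[] | {name, jvm: .jvm.version}'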
STEP 2: Update the JVM version in the IDE (the following is for IntelliJ).
Update to the required JVM version and add it to the project.
STEP 3: Run both ES nodes; the TransportSerializationException problem should be fixed:
$ curl -XGET "http://localhost:9200/_nodes?jvm=true&pretty=true"
{
"cluster_name": "elasticsearch",
"nodes": {
"GeRZFRiDSje8zLM_m90WRw": {
"name" : "Dougboy",
"jvm": {
"pid": 15223,
"version": "1.7.0_65",
"vm_name": "Java HotSpot(TM) 64-Bit Server VM",
"vm_version": "24.65-b04",
"vm_vendor": "Oracle Corporation",
"start_time_in_millis": 1421586819876,
"mem": {
"heap_init_in_bytes": 59426304,
"heap_max_in_bytes": 846200832,
"non_heap_init_in_bytes": 24576000,
"non_heap_max_in_bytes": 136314880,
"direct_max_in_bytes": 846200832
}
}
},
"A6PDUvlWSN-zN2GKRxrSHA": {
"name": "Madeline Joyce",
"jvm": {
"pid": 9036,
"version": "1.7.0_65",
"vm_name": "Java HotSpot(TM) 64-Bit Server VM",
"vm_version": "24.65-b04",
"vm_vendor": "Oracle Corporation",
"start_time_in_millis": 1421578674811,
"mem": {
"heap_init_in_bytes": 268435456,
"heap_max_in_bytes": 1038876672,
"non_heap_init_in_bytes": 24313856,
"non_heap_max_in_bytes": 136314880,
"direct_max_in_bytes": 1038876672
}
}
}
}
}
Reference: Java Client TransportSerializationException #3835, Oct 6, 2013
Looks like it was an issue with the plugin: Elasticsearch 0.90.7 threw an exception which wasn't caught by the plugin.
This pull request: https://github.com/mstein/elasticsearch-grails-plugin/pull/74 has a fix and includes ES 0.90.7.