I am trying to find the calibration from
I was following the type of inputs described here, but I am getting this error:
error: OpenCV(4.4.0) /tmp/pip-req-build-6amqbhlx/opencv/modules/calib3d/src/solvepnp.cpp:753: error: (-215:Assertion failed) ( (npoints >= 4) || (npoints == 3 && flags == SOLVEPNP_ITERATIVE && useExtrinsicGuess) ) && npoints == std::max(ipoints.checkVector(2, CV_32F), ipoints.checkVector(2, CV_64F)) in function 'solvePnPGeneric'
Example code:
Any hints where this error might be coming from?
I have a dataset of meme URLs from which I want to extract the text. I have this function:
import cv2
import numpy as np
import pytesseract
import requests
from io import BytesIO
from PIL import Image

def image2text(path_x):
    with requests.get(path_x, stream=True) as r:
        request_x = r.content
    img = Image.open(BytesIO(request_x))
    bilat_x = cv2.bilateralFilter(np.array(img), 5, 55, 60)
    img.close()
    cv2_x = cv2.cvtColor(bilat_x, cv2.COLOR_BGR2GRAY)
    _, img = cv2.threshold(cv2_x, 240, 255, 1)
    meme_text = pytesseract.image_to_string(img, lang='eng')
    return meme_text
image2text('https://i.redd.it/r9lw184zky881.png')
I receive the following error:
error: OpenCV(4.1.2) /io/opencv/modules/imgproc/src/bilateral_filter.dispatch.cpp:166: error: (-215:Assertion failed) (src.type() == CV_8UC1 || src.type() == CV_8UC3) && src.data != dst.data in function 'bilateralFilter_8u'
BTW, the code is mainly copied from this source
I fixed it by adding the following line after opening the image:
img = cv2.cvtColor(np.array(img), cv2.COLOR_BGRA2BGR)
I have an OData service for file download functionality. I have to include the $value parameter in the OData call to trigger GET_STREAM in the backend.
With an external breakpoint on the GET_STREAM method I can see that the method is triggered, but the file download doesn't work. I get an HTTP response 200, which means everything is supposedly OK, but I don't see any data. The response in /IWFND/GW_CLIENT is empty.
So I debugged and saw that at the end of the GET_STREAM method a parameter is changed by these lines:
COPY_DATA_TO_REF(
  EXPORTING IS_DATA = LS_STREAM
  CHANGING  CR_DATA = ER_STREAM ).
In ls_stream there should be two components (column1 and column2).
One of them is set and the other is empty; no matter what value I give to column2, it stays empty. Could the response in /IWFND/GW_CLIENT be empty because column2 is empty? Can you give me a suggestion?
My redefined methods are:
_DPC_EXT:
/IWBEP/IF_MGW_APPL_SRV_RUNTIME~GET_STREAM
ATTACHMENTSET_GET_ENTITYSET
_MPC_EXT:
DEFINE
I'm not sure what these two components column1 and column2 you are referring to are...
To return a PDF, you must define the entity type as being "Media" (checkbox), and transfer the PDF through the type /iwbep/if_mgw_appl_types=>ty_s_media_resource (components value of type XSTRING and mime_type).
More information in SAP Library > SAP Gateway Foundation Developer Guide > Media Links.
Example:
OData service ZGETPDF, with entity type File of type "Media" (and entity set FileSet)
URL to query the PDF: https://server.company.com:44322/sap/opu/odata/sap/ZGETPDF_SRV/FileSet('notused.pdf')/$value
ZCL_ZGETPDF_DPC_EXT class:
GET_STREAM method:
METHOD /iwbep/if_mgw_appl_srv_runtime~get_stream.
  DATA: ls_stream TYPE /iwbep/if_mgw_appl_types=>ty_s_media_resource.
  ls_stream-value = get_dummy_pdf( ).
  ls_stream-mime_type = 'application/pdf'.
  copy_data_to_ref( EXPORTING is_data = ls_stream
                    CHANGING  cr_data = er_stream ).
ENDMETHOD.
Code to get a dummy PDF (any PDF, just for demo):
METHODS get_dummy_pdf RETURNING VALUE(result) TYPE xstring.
...
METHOD get_dummy_pdf.
" PDF from http://www.tagg.org/pdftest.pdf University of Liverpool
DATA(base64_string) =
'JVBERi0xLjIgDQol4uPP0w0KIA0KOSAwIG9iag0KPDwNCi9MZW5ndGggMTAgMCBSDQovRmlsdGVyIC9GbGF0ZURlY29kZSANCj4+'
&& 'DQpzdHJlYW0NCkiJzZDRSsMwFIafIO/we6eyZuckTZPtbtIWBi0UjYKQGxFbJmpliuLb26QM8X6CJBfJyf99ycmFF6xJagWrrMxz'
&& 'wJeCEMd+gFjWBC1dLPeCJFkbl/fTKfwnTqt1CK0xIZyEwFYZ2T+fwT8KnmIxUmJinNKJyUiyW7mZVEQ6I54m2K3ZzFiupvgPaee7'
&& 'JHFuZqyDvxuGBbZdu8D1y+7jYf+2e//C2KOJm9dxfEqqTHMRXZlR0hRJuKwZau6EJa+MOdjpYN/gprq8xVW7aRp0ZY162ySbktoW'
&& 'vxpPZULGxJLSr+G4UuX+QHrcl/rz/2eqvPgGPPWhqg0KZW5kc3RyZWFtDQplbmRvYmoNCjEwIDAgb2JqDQoyNDYNCmVuZG9iag0K'
&& 'NCAwIG9iag0KPDwNCi9UeXBlIC9QYWdlDQovUGFyZW50IDUgMCBSDQovUmVzb3VyY2VzIDw8DQovRm9udCA8PA0KL0YwIDYgMCBS'
&& 'IA0KL0YxIDcgMCBSIA0KPj4NCi9Qcm9jU2V0IDIgMCBSDQo+Pg0KL0NvbnRlbnRzIDkgMCBSDQo+Pg0KZW5kb2JqDQo2IDAgb2Jq'
&& 'DQo8PA0KL1R5cGUgL0ZvbnQNCi9TdWJ0eXBlIC9UcnVlVHlwZQ0KL05hbWUgL0YwDQovQmFzZUZvbnQgL0FyaWFsDQovRW5jb2Rp'
&& 'bmcgL1dpbkFuc2lFbmNvZGluZw0KPj4NCmVuZG9iag0KNyAwIG9iag0KPDwNCi9UeXBlIC9Gb250DQovU3VidHlwZSAvVHJ1ZVR5'
&& 'cGUNCi9OYW1lIC9GMQ0KL0Jhc2VGb250IC9Cb29rQW50aXF1YSxCb2xkDQovRmlyc3RDaGFyIDMxDQovTGFzdENoYXIgMjU1DQov'
&& 'V2lkdGhzIFsgNzUwIDI1MCAyNzggNDAyIDYwNiA1MDAgODg5IDgzMyAyMjcgMzMzIDMzMyA0NDQgNjA2IDI1MCAzMzMgMjUwIA0K'
&& 'Mjk2IDUwMCA1MDAgNTAwIDUwMCA1MDAgNTAwIDUwMCA1MDAgNTAwIDUwMCAyNTAgMjUwIDYwNiA2MDYgNjA2IA0KNDQ0IDc0NyA3'
&& 'NzggNjY3IDcyMiA4MzMgNjExIDU1NiA4MzMgODMzIDM4OSAzODkgNzc4IDYxMSAxMDAwIDgzMyANCjgzMyA2MTEgODMzIDcyMiA2'
&& 'MTEgNjY3IDc3OCA3NzggMTAwMCA2NjcgNjY3IDY2NyAzMzMgNjA2IDMzMyA2MDYgDQo1MDAgMzMzIDUwMCA2MTEgNDQ0IDYxMSA1'
&& 'MDAgMzg5IDU1NiA2MTEgMzMzIDMzMyA2MTEgMzMzIDg4OSA2MTEgDQo1NTYgNjExIDYxMSAzODkgNDQ0IDMzMyA2MTEgNTU2IDgz'
&& 'MyA1MDAgNTU2IDUwMCAzMTAgNjA2IDMxMCA2MDYgDQo3NTAgNTAwIDc1MCAzMzMgNTAwIDUwMCAxMDAwIDUwMCA1MDAgMzMzIDEw'
&& 'MDAgNjExIDM4OSAxMDAwIDc1MCA3NTAgDQo3NTAgNzUwIDI3OCAyNzggNTAwIDUwMCA2MDYgNTAwIDEwMDAgMzMzIDk5OCA0NDQg'
&& 'Mzg5IDgzMyA3NTAgNzUwIA0KNjY3IDI1MCAyNzggNTAwIDUwMCA2MDYgNTAwIDYwNiA1MDAgMzMzIDc0NyA0MzggNTAwIDYwNiAz'
&& 'MzMgNzQ3IA0KNTAwIDQwMCA1NDkgMzYxIDM2MSAzMzMgNTc2IDY0MSAyNTAgMzMzIDM2MSA0ODggNTAwIDg4OSA4OTAgODg5IA0K'
&& 'NDQ0IDc3OCA3NzggNzc4IDc3OCA3NzggNzc4IDEwMDAgNzIyIDYxMSA2MTEgNjExIDYxMSAzODkgMzg5IDM4OSANCjM4OSA4MzMg'
&& 'ODMzIDgzMyA4MzMgODMzIDgzMyA4MzMgNjA2IDgzMyA3NzggNzc4IDc3OCA3NzggNjY3IDYxMSANCjYxMSA1MDAgNTAwIDUwMCA1'
&& 'MDAgNTAwIDUwMCA3NzggNDQ0IDUwMCA1MDAgNTAwIDUwMCAzMzMgMzMzIDMzMyANCjMzMyA1NTYgNjExIDU1NiA1NTYgNTU2IDU1'
&& 'NiA1NTYgNTQ5IDU1NiA2MTEgNjExIDYxMSA2MTEgNTU2IDYxMSANCjU1NiBdDQovRW5jb2RpbmcgL1dpbkFuc2lFbmNvZGluZw0K'
&& 'L0ZvbnREZXNjcmlwdG9yIDggMCBSDQo+Pg0KZW5kb2JqDQo4IDAgb2JqDQo8PA0KL1R5cGUgL0ZvbnREZXNjcmlwdG9yDQovRm9u'
&& 'dE5hbWUgL0Jvb2tBbnRpcXVhLEJvbGQNCi9GbGFncyAxNjQxOA0KL0ZvbnRCQm94IFsgLTI1MCAtMjYwIDEyMzYgOTMwIF0NCi9N'
&& 'aXNzaW5nV2lkdGggNzUwDQovU3RlbVYgMTQ2DQovU3RlbUggMTQ2DQovSXRhbGljQW5nbGUgMA0KL0NhcEhlaWdodCA5MzANCi9Y'
&& 'SGVpZ2h0IDY1MQ0KL0FzY2VudCA5MzANCi9EZXNjZW50IDI2MA0KL0xlYWRpbmcgMjEwDQovTWF4V2lkdGggMTAzMA0KL0F2Z1dp'
&& 'ZHRoIDQ2MA0KPj4NCmVuZG9iag0KMiAwIG9iag0KWyAvUERGIC9UZXh0ICBdDQplbmRvYmoNCjUgMCBvYmoNCjw8DQovS2lkcyBb'
&& 'NCAwIFIgXQ0KL0NvdW50IDENCi9UeXBlIC9QYWdlcw0KL01lZGlhQm94IFsgMCAwIDYxMiA3OTIgXQ0KPj4NCmVuZG9iag0KMSAw'
&& 'IG9iag0KPDwNCi9DcmVhdG9yICgxNzI1LmZtKQ0KL0NyZWF0aW9uRGF0ZSAoMS1KYW4tMyAxODoxNVBNKQ0KL1RpdGxlICgxNzI1'
&& 'LlBERikNCi9BdXRob3IgKFVua25vd24pDQovUHJvZHVjZXIgKEFjcm9iYXQgUERGV3JpdGVyIDMuMDIgZm9yIFdpbmRvd3MpDQov'
&& 'S2V5d29yZHMgKCkNCi9TdWJqZWN0ICgpDQo+Pg0KZW5kb2JqDQozIDAgb2JqDQo8PA0KL1BhZ2VzIDUgMCBSDQovVHlwZSAvQ2F0'
&& 'YWxvZw0KL0RlZmF1bHRHcmF5IDExIDAgUg0KL0RlZmF1bHRSR0IgIDEyIDAgUg0KPj4NCmVuZG9iag0KMTEgMCBvYmoNClsvQ2Fs'
&& 'R3JheQ0KPDwNCi9XaGl0ZVBvaW50IFswLjk1MDUgMSAxLjA4OTEgXQ0KL0dhbW1hIDAuMjQ2OCANCj4+DQpdDQplbmRvYmoNCjEy'
&& 'IDAgb2JqDQpbL0NhbFJHQg0KPDwNCi9XaGl0ZVBvaW50IFswLjk1MDUgMSAxLjA4OTEgXQ0KL0dhbW1hIFswLjI0NjggMC4yNDY4'
&& 'IDAuMjQ2OCBdDQovTWF0cml4IFswLjQzNjEgMC4yMjI1IDAuMDEzOSAwLjM4NTEgMC43MTY5IDAuMDk3MSAwLjE0MzEgMC4wNjA2'
&& 'IDAuNzE0MSBdDQo+Pg0KXQ0KZW5kb2JqDQp4cmVmDQowIDEzDQowMDAwMDAwMDAwIDY1NTM1IGYNCjAwMDAwMDIxNzIgMDAwMDAg'
&& 'bg0KMDAwMDAwMjA0NiAwMDAwMCBuDQowMDAwMDAyMzYzIDAwMDAwIG4NCjAwMDAwMDAzNzUgMDAwMDAgbg0KMDAwMDAwMjA4MCAw'
&& 'MDAwMCBuDQowMDAwMDAwNTE4IDAwMDAwIG4NCjAwMDAwMDA2MzMgMDAwMDAgbg0KMDAwMDAwMTc2MCAwMDAwMCBuDQowMDAwMDAw'
&& 'MDIxIDAwMDAwIG4NCjAwMDAwMDAzNTIgMDAwMDAgbg0KMDAwMDAwMjQ2MCAwMDAwMCBuDQowMDAwMDAyNTQ4IDAwMDAwIG4NCnRy'
&& 'YWlsZXINCjw8DQovU2l6ZSAxMw0KL1Jvb3QgMyAwIFINCi9JbmZvIDEgMCBSDQovSUQgWzw0NzE0OTUxMDQzM2RkNDg4MmYwNWY4'
&& 'YzEyNDIyMzczND48NDcxNDk1MTA0MzNkZDQ4ODJmMDVmOGMxMjQyMjM3MzQ+XQ0KPj4NCnN0YXJ0eHJlZg0KMjcyNg0KJSVFT0YN'
&& 'CgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
&& 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
&& 'AAAAAAAAAAAAAAAAAAAA'.
CALL TRANSFORMATION ID SOURCE x = base64_string RESULT x = result.
ENDMETHOD.
This question is also discussed there.
I'm trying to find the largest blob in an image and classify it according to a linked plist file. I'm using the latest version of OpenCV for iOS, and I've looked at several related questions, but none so far relate to iOS.
I'm getting this error:
OpenCV Error: Assertion failed (type == src2.type() && src1.cols == src2.cols && (type == CV_32F || type == CV_8U)) in batchDistance, file /Users/admin/Desktop/OpenCV/modules/core/src/stat.cpp, line 4000
libc++abi.dylib: terminating with uncaught exception of type cv::Exception: /Users/admin/Desktop/OpenCV/modules/core/src/stat.cpp:4000: error: (-215) type == src2.type() && src1.cols == src2.cols && (type == CV_32F || type == CV_8U) in function batchDistance
when I run this:
- (IBAction)CaptureButton:(id)sender
{
    // Find the biggest blob.
    int biggestBlobIndex = 0;
    for (int i = 0, biggestBlobArea = 0; i < detectedBlobs.size(); i++)
    {
        Blob &detectedBlob = detectedBlobs[i];
        int blobArea = detectedBlob.getWidth() * detectedBlob.getHeight();
        if (blobArea > biggestBlobArea)
        {
            biggestBlobIndex = i;
            biggestBlobArea = blobArea;
        }
    }
    Blob &biggestBlob = detectedBlobs[biggestBlobIndex];

    // Classify the blob.
    blobClassifier->classify(biggestBlob); // the error occurs here
}
The classify that I'm calling in the last line there was declared in another file:
void classify(Blob &detectedBlob) const;
This is the relevant code from stat.cpp:
Mat src1 = _src1.getMat(), src2 = _src2.getMat(), mask = _mask.getMat();
int type = src1.type();
CV_Assert( type == src2.type() && src1.cols == src2.cols &&
(type == CV_32F || type == CV_8U)); // this is line 4000
What's the issue here?
I don't know how cv::Mat objects look in Objective-C, but you need to make sure that the dimensions, channel count, and depth of all images used with the classifier are uniform. There was probably an earlier step where you fed the classifier training images; maybe one of them is not compatible with the Mat you are trying to classify.
You can try debugging inside OpenCV if you compile it yourself and select a debug build in CMake.
I am new to Swift and trying to solve a very basic logical AND problem.
if (textField == self.cvv && cvv.text.length == 4 && !string.isEmpty)
{
    return false;
}
This is my code.
According to
https://developer.apple.com/library/ios/documentation/Swift/Conceptual/Swift_Programming_Language/BasicOperators.html
the && operator does exist; however, I am getting this error:
Couldn't find an overload for the &&
How can I use logical operations here?
It is not a logical && error: the compiler is confused by an earlier error:
'String' does not have a member named 'length'
count(cvv.text) is not available either.
See #RMenke's answer for Swift 2.0 syntax.
I just tried this:
var textField = UITextField()
var cvv = UITextField()
var string = ""
if (textField == cvv && cvv.text!.characters.count == 4 && string.isEmpty)
{
    return false;
}
I couldn't replicate the error because I have no idea about the types and declarations of all the instances involved. However, I did get an error on text.length; this might be a Swift 1.2 => Swift 2.0 change, so I updated it to text.characters.count.
To more fully answer your question...
&& has always worked fine for me, exactly the way you use it:
condition1 operatorA condition2 && condition3 operatorB condition4 ...
However, the "couldn't find an overload" error is often the result of a type mismatch, such as checking an Int against a String.
I've been looking for hours, but I can't find the problem.
I get the following error when I want to stitch two images together:
OpenCV Error: Assertion failed (y == 0 || data && dims >= 1 && (unsigned)y < (unsigned)size.p[0]) in unknown function...
This is the code (pano.jpg was already stitched together in a previous run, where the same algorithm did work...):
cv::Mat img1 = imread("input2.jpg");
cv::Mat img2 = imread("pano.jpg");
std::vector<cv::Mat> vectest;
vectest.push_back(img2);
vectest.push_back(img1);
cv::Mat result;
cv::Stitcher stitcher = cv::Stitcher::createDefault( false );
stitcher.setPanoConfidenceThresh(0.01);
detail::BestOf2NearestMatcher *matcher = new detail::BestOf2NearestMatcher(false, 0.001/*=match_conf*/);
detail::SurfFeaturesFinder *featureFinder = new detail::SurfFeaturesFinder(100);
stitcher.setFeaturesMatcher(matcher);
stitcher.setFeaturesFinder(featureFinder);
cv::Stitcher::Status status = stitcher.stitch( vectest, result );
You can find the images here:
pano.jpg: https://dl.dropbox.com/u/5276376/pano.jpg
input2.jpg: https://dl.dropbox.com/u/5276376/input2.jpg
Edit:
I compiled OpenCV 2.4.2 myself, but I still get the same problem...
The system crashes in the stitcher.cpp file on the following line:
blender_->feed(img_warped_s, mask_warped, corners[img_idx]);
Inside this feed function it crashes at these lines:
int y_ = y - y_tl;
const Point3_<short>* src_row = src_pyr_laplace[i].ptr<Point3_<short> >(y_);
Point3_<short>* dst_row = dst_pyr_laplace_[i].ptr<Point3_<short> >(y);
And finally this assertion in mat.hpp:
template<typename _Tp> inline _Tp* Mat::ptr(int y)
{
CV_DbgAssert( y == 0 || (data && dims >= 1 && (unsigned)y < (unsigned)size.p[0]) );
return (_Tp*)(data + step.p[0]*y);
}
Strange that everything works fine for some people here...
I am stitching images now as well, but instead of using the high-level Stitcher functionality I code every step myself with OpenCV 2.4.2. As far as I know, you could try running SurfFeaturesFinder first and BestOf2NearestMatcher second. Just a try, good luck!