Open cash drawer with Epson TM-T81 iOS SDK - iOS

I want to open a cash drawer that I bought; it is a printer-driven cash drawer, and I have an Epson TM-T81 receipt printer.
I get a delegate callback when I open and close the printer manually, but I want the drawer to open automatically when a receipt is printed.
The code I have written is:
-(void)openDrawer
{
    EposBuilder *builder = [[EposBuilder alloc] initWithPrinterModel:@"TM-P20" Lang:0];
    if (builder == nil) {
        return;
    }
    // add command
    int result;
    result = [builder addPulse:EPOS_OC_DRAWER_1 Time:EPOS_OC_PULSE_100];
    NSLog(@"%d command result", result);
    NSString *str = @"27 112 48 55 121";
    NSData *data = [str dataUsingEncoding:NSUTF8StringEncoding];
    result = [builder addCommand:data];
    NSLog(@"%d pulse", result);
    if (result != EPOS_OC_SUCCESS) {
        NSLog(@"cut failed");
        return;
    }
    // send builder data
    unsigned long status = 0;
    unsigned long battery = 0;
    result = [printer sendData:builder Timeout:10000 Status:&status Battery:&battery];
    // remove builder
    [builder clearCommandBuffer];
}
Looking for a solution from experts like you.

Check out the documentation:
http://spsrprofessionals.com/ClientSite/readers/ePOS-Print_SDK_141020E/iOS/ePOS-Print_SDK_iOS_en_revN.pdf#page98
You're looking for the addPulse method of the builder.
The only thing you need to know is which jack the cash drawer is connected to, in case your printer has more than one cash drawer jack.

I added the line of code below and got it working for the TM-T20 series.
[printer_ addPulse:EPOS2_DRAWER_HIGH time:EPOS2_PULSE_100];
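For completeness, here is a minimal sketch of the same idea with the older ePOS-Print builder API used in the question: append the drawer pulse to the same job as the receipt, so the drawer kicks as soon as the receipt prints. The model string, receipt text, and timeout are placeholders, not values taken from your setup.

- (void)printReceiptAndOpenDrawer:(EposPrint *)printer
{
    EposBuilder *builder = [[EposBuilder alloc] initWithPrinterModel:@"TM-T81" Lang:EPOS_OC_MODEL_ANK];
    if (builder == nil) {
        return;
    }
    // Receipt content, cut, then the drawer-kick pulse, all in one job.
    [builder addText:@"Thank you!\n"];
    [builder addCut:EPOS_OC_CUT_FEED];
    [builder addPulse:EPOS_OC_DRAWER_1 Time:EPOS_OC_PULSE_100];

    unsigned long status = 0;
    unsigned long battery = 0;
    [printer sendData:builder Timeout:10000 Status:&status Battery:&battery];
    [builder clearCommandBuffer];
}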

Related

Having trouble receiving device-to-device payments using NFCISO7816Tag on iOS

I perform the same transaction for both card payments and device payments.
When <NFCMifareTag: 0x280d12cc0> is received for a card payment, the hexString operation turns the incoming tag into 045c7d8a076b80, the tagId of my card, which is then transmitted to the service. When I scan the same card again, the tag object changes, but after the hexString operation I still get the same tagId; for example, 045c7d8a076b80 again for <NFCMifareTag: 0x280da9fc0>.
But when I scan a device for a device payment, the tagId I get after the hexString operation for <NFCISO7816Tag: 0x2829d18c0> is 0803ffc6. Every time I read it the tag object changes, just as with the card, but for the device the tagId also changes every time; for example, for <NFCISO7816Tag: 0x282849c80> I get the id 082c58cc.
The code I use:
id<NFCISO7816Tag> currentTag = [[tags firstObject] asNFCISO7816Tag];
dispatch_async(dispatch_get_main_queue(), ^{
    NSString *tagId = [[currentTag identifier] hexString];
    [self.delegate nfcManagerDidReadTag:tagId];
});

// hexString is a category method on NSData
- (NSString *)hexString {
    const unsigned char *bytes = (const unsigned char *)self.bytes;
    NSMutableString *hex = [NSMutableString new];
    for (NSInteger i = 0; i < self.length; i++) {
        [hex appendFormat:@"%02x", bytes[i]];
    }
    return [hex copy];
}
Is there anything else I need to do for the ISO7816 tag? Does Apple allow the tag I read from the device to be used this way? What operation can I apply to the tag I read so that I get the same tagId every time?
Or, if I need to use another tag type or a different approach, I would welcome your suggestions. Thank you.
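For reference, here is a minimal sketch of the CoreNFC delegate callback the snippet above presumably lives in, with an explicit connect before the identifier is read. The session handling is an assumption about your setup, not code from the question; it reuses the hexString category and delegate call shown above. (As a side note, per ISO/IEC 14443-3 a 4-byte UID whose first byte is 0x08, like 0803ffc6, is a randomly generated identifier, which would explain why it changes on every read.)

- (void)tagReaderSession:(NFCTagReaderSession *)session didDetectTags:(NSArray<__kindof id<NFCTag>> *)tags
{
    id<NFCISO7816Tag> currentTag = [[tags firstObject] asNFCISO7816Tag];
    if (currentTag == nil) {
        return;
    }
    [session connectToTag:[tags firstObject] completionHandler:^(NSError * _Nullable error) {
        if (error != nil) {
            [session invalidateSessionWithErrorMessage:@"Connection failed"];
            return;
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            NSString *tagId = [[currentTag identifier] hexString];
            [self.delegate nfcManagerDidReadTag:tagId];
        });
    }];
}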

Video streaming via NSInputStream and NSOutputStream

Right now I'm investigating the possibility of implementing video streaming through the MultipeerConnectivity framework. For that purpose I'm using NSInputStream and NSOutputStream.
The problem is that I can't receive any picture so far. Right now I'm trying to pass a simple picture and show it on the receiver. Here's a little snippet of my code:
Sending the picture via NSOutputStream:
- (void)sendMessageToStream
{
    NSData *imgData = UIImagePNGRepresentation(_testImage);
    int img_length = (int)[imgData length];
    NSMutableData *msgData = [[NSMutableData alloc] initWithBytes:&img_length length:sizeof(img_length)];
    [msgData appendData:imgData];
    int msg_length = (int)[msgData length];
    uint8_t *readBytes = (uint8_t *)[msgData bytes];
    uint8_t buf[msg_length];
    (void)memcpy(buf, readBytes, msg_length);
    int stream_len = [_stream write:(uint8_t *)buf maxLength:msg_length];
    //int stream_len = [_stream write:(uint8_t *)buf maxLength:data_length];
    //NSLog(@"stream_len = %d", stream_len);
    _tmpCounter++;
    dispatch_async(dispatch_get_main_queue(), ^{
        _lblOperationsCounter.text = [NSString stringWithFormat:@"Sent: %ld", (long)_tmpCounter];
    });
}
The code above works totally fine. The stream_len value after writing equals 29627 bytes, which is the expected value because the image's size is around 25-26 KB.
Receiving the picture via NSInputStream:
- (void)readDataFromStream
{
    UInt32 length;
    if (_currentFrameSize == 0) {
        uint8_t frameSize[4];
        length = [_stream read:frameSize maxLength:sizeof(int)];
        unsigned int b = frameSize[3];
        b <<= 8;
        b |= frameSize[2];
        b <<= 8;
        b |= frameSize[1];
        b <<= 8;
        b |= frameSize[0];
        _currentFrameSize = b;
    }
    uint8_t bytes[1024];
    length = [_stream read:bytes maxLength:1024];
    [_frameData appendBytes:bytes length:length];
    if ([_frameData length] >= _currentFrameSize) {
        UIImage *img = [UIImage imageWithData:_frameData];
        NSLog(@"SETUP IMAGE!");
        _imgView.image = img;
        _currentFrameSize = 0;
        [_frameData setLength:0];
    }
    _tmpCounter++;
    dispatch_async(dispatch_get_main_queue(), ^{
        _lblOperationsCounter.text = [NSString stringWithFormat:@"Received: %ld", (long)_tmpCounter];
    });
}
As you can see, I'm trying to receive the picture in several steps, and here's why. When I try to read data from the stream, it always reads at most 1095 bytes no matter what number I put in the maxLength: parameter. But when I send the picture in the first snippet of code, it is sent absolutely fine (29627 bytes; by the way, the image's size is around 29 KB).
That's where my question comes from: why is that? Why does sending 29 KB via NSOutputStream work totally fine while receiving causes problems? And is there a solid way to make video streaming work through NSInputStream and NSOutputStream? I just haven't found much information about this technology; all I found were simple things I already knew.
Here's an app I wrote that shows you how:
https://app.box.com/s/94dcm9qjk8giuar08305qspdbe0pc784
Build the project with Xcode 9 and run the app on two iOS 11 devices.
To stream live video, touch the Camera icon on one of two devices.
If you don't have two devices, you can run one app in the Simulator; however, you can only use the camera on the real device (the Simulator will display the broadcast video).
Just so you know: this is not the ideal way to stream real-time video between devices (it should probably be your last choice). Data packets (versus streaming) are way more efficient and faster.
Regardless, I'm really confused by your NSInputStream-related code. Here's something that makes a little more sense, I think:
case NSStreamEventHasBytesAvailable: {
    // len is a global variable set to a non-zero value;
    // mdata is an NSMutableData object that is reset when a new input
    // stream is created.
    // displayImage is a block that accepts the image data and a reference
    // to the layer on which the image will be rendered
    uint8_t buf[len];
    len = [aStream read:buf maxLength:len];
    if (len > 0) {
        [mdata appendBytes:(const void *)buf length:len];
    } else {
        displayImage(mdata, wLayer);
    }
    break;
}
The output stream code should look something like this:
// data is an NSData object that contains the image data from the video
// camera;
// len is a global variable set to a non-zero value
// byteIndex is a global variable set to zero each time a new output
// stream is created
if (data.length > 0 && len >= 0 && (byteIndex <= data.length)) {
    len = (data.length - byteIndex) < DATA_LENGTH ? (data.length - byteIndex) : DATA_LENGTH;
    uint8_t bytes[len];
    [data getBytes:bytes range:NSMakeRange(byteIndex, len)];
    byteIndex += [oStream write:(const uint8_t *)bytes maxLength:len];
}
There's a lot more to streaming video than setting up the NSStream classes correctly—a lot more. You'll notice in my app, I created a cache for the input and output streams. This solved a myriad of issues that you would likely encounter if you don't do the same.
I have never seen anyone successfully use NSStreams for video streaming...ever. It's highly complex, for one reason.
There are many different (and better) ways to stream video; I wouldn't go this route. I just took it on because no one else has been able to do it successfully.
I think the problem is in your assumption that all the data will be available in the NSInputStream the whole time you are reading it. An NSInputStream made from an NSURL object has an asynchronous nature, and it should be accessed accordingly using an NSStreamDelegate. You can look at the example in the README of POSInputStreamLibrary.
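To make the asynchronous point concrete, here is a minimal sketch (not taken from the linked app) of a delegate-driven receiver that accumulates the question's 4-byte little-endian length prefix and the frame body across however many NSStreamEventHasBytesAvailable callbacks it takes. It assumes _frameData, _currentFrameSize, and _imgView are the same instance variables used in the question.

// Sketch: delegate-driven receive loop for the question's length-prefixed frames.
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    if (eventCode != NSStreamEventHasBytesAvailable) {
        return;
    }
    uint8_t chunk[1024];
    while ([(NSInputStream *)aStream hasBytesAvailable]) {
        NSInteger len = [(NSInputStream *)aStream read:chunk maxLength:sizeof(chunk)];
        if (len <= 0) {
            break;
        }
        [_frameData appendBytes:chunk length:len];

        // Parse the 4-byte little-endian length prefix once enough bytes have arrived.
        if (_currentFrameSize == 0 && _frameData.length >= 4) {
            uint32_t size = 0;
            [_frameData getBytes:&size length:4];
            _currentFrameSize = CFSwapInt32LittleToHost(size);
        }
        // A whole frame (prefix + body) is buffered: decode it and drop it from the buffer.
        if (_currentFrameSize > 0 && _frameData.length >= _currentFrameSize + 4) {
            NSData *body = [_frameData subdataWithRange:NSMakeRange(4, _currentFrameSize)];
            UIImage *img = [UIImage imageWithData:body];
            dispatch_async(dispatch_get_main_queue(), ^{
                _imgView.image = img;
            });
            [_frameData replaceBytesInRange:NSMakeRange(0, _currentFrameSize + 4) withBytes:NULL length:0];
            _currentFrameSize = 0;
        }
    }
}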

iOS - Issue receiving data from NSStream

I am making an application that senses iBeacons. When you get within immediate range of an iBeacon, the application sends the beacon's major and minor numbers to a server, and the server sends back an image that is stored in a MySQL database; different images are sent back based on the major and minor numbers.
The application sends the major and minor numbers to a Python (Twisted sockets) script via an NSStream; the script uses these numbers to get an image from the database and send it back to the application.
This setup works great when I use it to get simple text messages back from the database, but I am running into problems when trying to receive and display images inside the application.
First I will post the code of the stream:handleEvent: method that receives the data from the input stream.
The code is only a slight modification of this tutorial: http://www.raywenderlich.com/3932/networking-tutorial-for-ios-how-to-create-a-socket-based-iphone-app-and-server
// input stream event handler that receives the data from the server
//
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode)
    {
        case NSStreamEventOpenCompleted:
            NSLog(@"stream opened");
            break;
        case NSStreamEventHasBytesAvailable: // event for receiving data
            NSLog(@"Received Data");
            if (aStream == _inputStream)
            {
                uint8_t buffer[500000];
                int len;
                // loop gets bytes from the input stream
                //
                while ([_inputStream hasBytesAvailable])
                {
                    len = [_inputStream read:buffer maxLength:sizeof(buffer)];
                    if (len > 0)
                    {
                        NSString *str = @"data:image/jpg;base64,";
                        NSString *img = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
                        str = [str stringByAppendingString:img];
                        NSData *ImgOut = [NSData dataWithContentsOfURL:[NSURL URLWithString:str]];
                        if (nil != ImgOut)
                        {
                            self.ImageView.image = [UIImage imageWithData:ImgOut];
                            NSLog(@"show image");
                        }
                    }
                }
            }
            break;
        case NSStreamEventErrorOccurred:
            NSLog(@"can not connect to host");
            [self initNetworkComms];
            break;
        case NSStreamEventEndEncountered:
            NSLog(@"Connection Lost");
            [_outputStream close];
            [_inputStream close];
            [self initNetworkComms];
            break;
        default:
            NSLog(@"unknown event");
            break;
    }
}
Just for good measure, I will also post the code of the Python script:
from twisted.internet.protocol import Protocol, Factory
from twisted.internet import reactor
import mysql.connector

db = mysql.connector.connect(user='NotImportant', password='WouldntYouLikeToKnow', host='localhost', database='retailbeacons')
cursor = db.cursor()

class MessageServer(Protocol):
    def connectionMade(self):
        self.factory.clients.append(self)
        print "clients are ", self.factory.clients

    def connectionLost(self, reason):
        self.factory.clients.remove(self)
        print "client has disconnected"

    def dataReceived(self, data):
        a = data.split(':')
        if len(a) > 1:
            Major = a[0]
            Minor = a[1]
            msg = ""
            print "Received query " + Major + ":" + Minor
            sql = "SELECT Picture FROM beaconinfo WHERE major=" + Major + " AND minor=" + Minor + ";"
            cursor.execute(sql)
            for row in cursor.fetchall():
                mess = row[0]
                msg = mess.encode('utf-8')
            self.message(msg)

    def message(self, message):
        self.transport.write(message + '\n')

factory = Factory()
factory.protocol = MessageServer
factory.clients = []
reactor.listenTCP(8080, factory)
print "Python message test server started"
reactor.run()
What happens with this code is that when the app queries the server, the server sends back the image data (in base64 format), the application receives this data, and the EventHasBytesAvailable case of the switch statement is triggered. But only a small portion of the image is displayed, and I get an error log saying:
<Error>: ImageIO: JPEG Corrupt JPEG data: premature end of data segment
This led me to believe that not all the data came across the stream. You'll see in the code that I log 'Received Data' every time the EventHasBytesAvailable case is called, and 'show image' when the UIImageView is set with the image data.
The thing I find odd, and what I feel is the source of this problem, is that when EventHasBytesAvailable is called, the 'Received Data' message is logged, then the 'show image' message is logged, then the 'Received Data' message is logged once more, and the error listed above is then logged.
So it looks like a small portion of the data comes in through the stream, the loop gathers up those bytes and sticks them in the UIImageView, then more bytes come in through the stream and an attempt to put them into the UIImageView is made, but the 'premature end of data segment' error occurs.
I am very confused as to why this is happening. Shouldn't all of the image data be sent through the stream within one call of the EventHasBytesAvailable case? Possibly I have overlooked the buffer in my code? Can my buffer take an image of 60 KB? That is the only thing I can think of that might be wrong with the application code; otherwise, all I can think of is that maybe the Python script is sending the data in two chunks instead of one.
Thank you for your time. I am an intern who has hit a bit of a wall with this one! Any help will be greatly appreciated!
I fixed this problem. The stream was sending the data over in more than one call of the 'HasBytes' case, so I created a string that gets appended with each chunk of data when 'HasBytes' is called. I also used a different method for converting the image data string to an NSData object.
NSString *ImgStr = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];

// string property used for appending
//
_str = [_str stringByAppendingString:ImgStr];

NSData *ImgData = [[NSData alloc] initWithBase64EncodedString:_str options:1];
if (nil != ImgData)
{
    self.ImageView.image = [UIImage imageWithData:ImgData];
}
Thanks very much!
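A side note on framing: since the Twisted server above terminates each reply with '\n' (self.transport.write(message + '\n')), one option is to wait for that delimiter before decoding, so a half-received base64 string is never handed to the decoder. The sketch below is an assumption about how that could look inside the 'HasBytes' case, reusing the buffer, len, and _str names from the code above.

// Sketch: only decode once the server's trailing '\n' delimiter has arrived.
NSString *chunk = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
_str = [_str stringByAppendingString:chunk];

NSRange newline = [_str rangeOfString:@"\n"];
if (newline.location != NSNotFound) {
    NSString *base64 = [_str substringToIndex:newline.location];
    NSData *imgData = [[NSData alloc] initWithBase64EncodedString:base64
                                                          options:NSDataBase64DecodingIgnoreUnknownCharacters];
    if (imgData != nil) {
        self.ImageView.image = [UIImage imageWithData:imgData];
    }
    _str = @""; // reset for the next reply
}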

Waiting time Epson T88V

PROBLEM:
How to print (multiple) receipts without any delay?
As far as I know, there are at least 3 options for printing with a T88V from OS X or an iOS device. Unfortunately, each of those options has a flaw.
OPTION 1:
I've been playing around with the official OS X driver (latest version 1.2a) and have no problems printing with it. However, there is a 1 sec delay between every print command.
- (IBAction)button1:(id)sender
{
    [self TMAppPrint];
}

- (OSStatus)TMAppPrint
{
    TMTextView *textWindow = [[TMTextView alloc] init];
    NSPrintInfo *printInfo = [NSPrintInfo sharedPrintInfo];
    OSStatus err = noErr;
    err = [TMPrintSupport TMSetJobTicket:printInfo printerName:@"TM-T88V" documentSize:NSMakeSize(204.0, 841.8) resolution:@"180x180dpi" speed:@"1" blank:@"Off" paperCut:@"DocFeedCut" chashDrwr1:@"Off" chashDrwr2:@"Off" buzzerControl:@"Off" buzzerPattern:@"Internal" buzzerRepeat:@"1"];
    NSPrintOperation *printOperation = [NSPrintOperation printOperationWithView:textWindow printInfo:printInfo];
    [printOperation setCanSpawnSeparateThread:YES];
    [printOperation setShowsPrintPanel:NO];
    [printOperation runOperation];
    return err;
}
OPTION 2:
Using the iOS SDK (latest version 1.3.0) I can also print without problems, but it's even worse: after sending the print command, there is a 1 sec delay until it prints.
- (IBAction)button2:(id)sender
{
    if (printer != nil)
    {
        errorStatus = EPOS_OC_SUCCESS;
        if (builder != nil)
        {
            int printStatus = EPOS_OC_SUCCESS;
            // create a print document
            printStatus = [builder addTextLang:EPOS_OC_LANG_EN];
            printStatus = [builder addTextSmooth:EPOS_OC_TRUE];
            printStatus = [builder addTextFont:EPOS_OC_FONT_A];
            printStatus = [builder addTextSize:1 Height:1];
            printStatus = [builder addTextStyle:EPOS_OC_FALSE Ul:EPOS_OC_FALSE Em:EPOS_OC_TRUE Color:EPOS_OC_PARAM_UNSPECIFIED];
            // specify the print data
            printStatus = [builder addText:@"hello!\n"];
            printStatus = [builder addCut:EPOS_OC_CUT_FEED];
            // send data
            errorStatus = [printer sendData:builder Timeout:1000 Status:&status];
            // end communication with the printer
            errorStatus = [printer closePrinter];
        }
    }
}
OPTION 3:
The last option is using ESC/POS commands. I managed to get it to print some basic lines, but I still don't understand most of it. There's no delay in printing, though.
- (IBAction)button3:(id)sender
{
    printer = [[EposPrint alloc] init];
    errorStatus = [printer openPrinter:EPOS_OC_DEVTYPE_TCP DeviceName:@"192.168.1.168"];
    builder = [[EposBuilder alloc] initWithPrinterModel:@"TM-T88V" Lang:EPOS_OC_MODEL_ANK];
}
I already asked Epson directly about this, but they couldn't give me any answer other than that it's working as intended... So do I have to dive into ESC/POS commands and learn them, or are there still other options?
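As a rough illustration of option 3, here is a minimal sketch of building a raw ESC/POS job with addCommand: (the same method used in the cash-drawer question above) and sending it in one call. The byte values are the standard ESC/POS initialize (ESC @), line feed, and feed-and-partial-cut (GS V B) commands; builder, printer, errorStatus, and the send call are assumed to match the ivars and SDK from your snippets.

- (IBAction)button4:(id)sender
{
    // ESC @ (initialize), one line of text terminated by LF, then GS V B 0 (feed and partial cut).
    uint8_t cmds[] = {
        0x1B, 0x40,
        'h', 'e', 'l', 'l', 'o', '!', 0x0A,
        0x1D, 0x56, 0x42, 0x00
    };
    [builder addCommand:[NSData dataWithBytes:cmds length:sizeof(cmds)]];

    unsigned long status = 0;
    errorStatus = [printer sendData:builder Timeout:1000 Status:&status];
    [builder clearCommandBuffer];
}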

Programmatically send iMessage using private frameworks

Does anyone know if it's possible to directly send an iMessage using a private framework?
I tried using CTMessageCenter from CoreTelephony, but it sends an SMS even though my phone can send iMessages.
I haven't tested this, but look at the code posted here. If you look at httpResponseForMethod:URI:, you can see where he/she sends a message (it appears to be hardcoded to support iOS 5 or iOS 4):
CKSMSService *smsService = [CKSMSService sharedSMSService];
//id ct = CTTelephonyCenterGetDefault();
CKConversationList *conversationList = nil;
NSString *value = [[UIDevice currentDevice] systemVersion];
if ([value hasPrefix:@"5"])
{
    //CKMadridService *madridService = [CKMadridService sharedMadridService];
    //NSString *foo = [madridService _temporaryFileURLforGUID:@"A5F70DCD-F145-4D02-B308-B7EA6C248BB2"];
    NSLog(@"Sending SMS");
    conversationList = [CKConversationList sharedConversationList];
    CKSMSEntity *ckEntity = [smsService copyEntityForAddressString:Phone];
    CKConversation *conversation = [conversationList conversationForRecipients:[NSArray arrayWithObject:ckEntity] create:TRUE service:smsService];
    NSString *groupID = [conversation groupID];
    CKSMSMessage *ckMsg = [smsService _newSMSMessageWithText:msg forConversation:conversation];
    [smsService sendMessage:ckMsg];
    [ckMsg release];
} else {
    // 4.0
    id ct = CTTelephonyCenterGetDefault();
    void *address = CKSMSAddressCreateWithString(pid);
    int group = [grp intValue];
    if (group <= 0) {
        group = CKSMSRecordCreateGroupWithMembers([NSArray arrayWithObject:address]);
    }
    void *msg_to_send = _CKSMSRecordCreateWithGroupAndAssociation(NULL, address, msg, group, 0);
    CKSMSRecordSend(ct, msg_to_send);
}
The code uses normal SMS, but you can see the following commented-out code:
//CKMadridService *madridService = [CKMadridService sharedMadridService];
The "Madrid" service is probably what can send iMessages. See the private header here.
Both SMS and iMessage private APIs are in the ChatKit.framework.
On a non-jailbroken iPhone there is absolutely no access to the iMessage CoreTelephony API.
