AS3 DatePicker in AIR for iOS

I'm showing a date picker using the following ANE:
https://github.com/freshplanet/ANE-DatePicker
On iPhone it always shows the date picker at the bottom of the screen. I took a look at AirDatePicker.as; on iPad custom positioning is possible, but I can't find any way to position the picker on iPhone.

This can be done using FPNativeUI.ane:
package control
{
    import flash.display.Sprite;
    import flash.geom.Rectangle;
    import flash.system.Capabilities;

    import ru.flashpress.nui.FPNativeUI;
    import ru.flashpress.nui.view.control.FPDatePicker;
    import ru.flashpress.nui.events.FPDatePickerEvent;
    import ru.flashpress.nui.view.system.FPWindowView;

    public class DatePickerProgrammatically extends Sprite
    {
        private var datePicker:FPDatePicker;

        public function DatePickerProgrammatically()
        {
            FPNativeUI.init();

            // desired picker bounds in Flash coordinates, converted to native screen coordinates
            var w:int = Capabilities.screenResolutionX;
            var bounds:Rectangle = new Rectangle(0, 100, w, 400);
            FPWindowView.window.screen.boundsFlashToNative(bounds);

            datePicker = new FPDatePicker();
            datePicker.frame = bounds;

            datePicker.stage.addChild(datePicker);
            datePicker.addEventListener(FPDatePickerEvent.VALUE_CHANGED, valueChangeHandler);
        }

        private function valueChangeHandler(event:FPDatePickerEvent):void
        {
            trace(event.date);
        }
    }
}


React Native "imports"

I am trying to develop a React Native book-searching app and found this tutorial as a helper: https://www.appcoda.com/react-native-introduction/
However, the site is from 2015, so some of the syntax is out of date. I've run into the following issue.
The code the tutorial tells me to use is:
'use strict';

var React = require('react-native');
var Books = require('./Books');
var Search = require('./Search');

var {
    AppRegistry,
    TabBarIOS,
    Component
} = React;
When I used that, I got an error message telling me to use
import React, { Component } from 'react';
instead of
import React from 'react-native';
In my attempt to update the code to the latest version of React, I arrived at:
'use strict';

import React, { Component } from 'react';

var Books = require('./Books');
var Search = require('./Search');

var {
    AppRegistry,
    TabBarIOS
} = React;
This is still causing multiple errors. Can someone explain how to properly do what I am trying to do?
You're trying to import AppRegistry and TabBarIOS from react instead of react-native, and there are some syntax errors. Change:
var {
    AppRegistry,
    TabBarIOS
} = React;
to
import {
    AppRegistry,
    TabBarIOS
} from 'react-native';
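The old `var { … } = React;` form is plain ES2015 object destructuring: it only finds names that actually exist on the object being destructured, which is why it broke once `AppRegistry` and `TabBarIOS` stopped living on the `react` export. A minimal sketch of the mechanism, using stand-in objects (hypothetical values for illustration, since `react-native` itself only runs on a device or simulator):

```javascript
// Stand-ins for the two modules: modern 'react' exports Component,
// while 'react-native' exports the platform UI pieces.
const react = { Component: class Component {} };
const reactNative = { AppRegistry: { registerComponent() {} }, TabBarIOS: {} };

// Destructuring only succeeds for names present on the source object:
const { Component } = react;                    // ok: Component lives on 'react'
const { AppRegistry, TabBarIOS } = reactNative; // ok: these live on 'react-native'

// Destructuring the wrong module silently yields undefined, which is
// what produced the original errors:
const { AppRegistry: missing } = react;
console.log(missing === undefined); // true
```

So the fix is not just a syntax change: each name has to be imported from the module that actually exports it.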

Whole-page screenshot in Java

Here is my problem: I have a desktop application written in JavaFX. I need to show a full-screen webpage to the user and save the rendered page as a PNG, and I need to save the whole page (e.g. at a resolution of 1920×3500).
Right now I'm using Selenium and Firefox to do this. It works fine, but there is one big disadvantage: the user must have Firefox installed on his machine.
I've tried rendering the webpage with JavaFX's WebView and taking a .snapshot(). This would be great, but that approach doesn't give me the whole page, only the visible part of the WebView. Is there any way to get a whole-page snapshot using WebView? Or any other ideas on how to do that? Thanks.
You may want to check out this post. I don't know if it is working or not, but it seems like a reasonable solution.
After a lot of searching and piecing several fragments together, I found that the only problems I had with an example an SO poster shared in an Oracle forum were that the size of the WebView was fixed, and that the CSS used in the HTML (not in JavaFX) needed
overflow-x: hidden;
overflow-y: hidden;
to hide the last scrollbar.
So I came up with the following snapshot method (an application with an animation, just as a stand-in for your application). See if it works for your sizes; otherwise you can scale the WebView down with javafx.scene.web.WebView.setZoom(double value) to something that does. You might lose some resolution, but at least you get the whole picture:
package application;

import java.io.File;
import java.io.IOException;

import javax.imageio.ImageIO;

import javafx.animation.Animation;
import javafx.animation.PauseTransition;
import javafx.animation.TranslateTransition;
import javafx.application.Application;
import javafx.embed.swing.SwingFXUtils;
import javafx.event.ActionEvent;
import javafx.event.EventHandler;
import javafx.geometry.Insets;
import javafx.geometry.Pos;
import javafx.scene.Scene;
import javafx.scene.control.Button;
import javafx.scene.control.Label;
import javafx.scene.effect.GaussianBlur;
import javafx.scene.image.WritableImage;
import javafx.scene.layout.AnchorPane;
import javafx.scene.layout.BorderPane;
import javafx.scene.layout.Pane;
import javafx.scene.paint.Color;
import javafx.scene.shape.Rectangle;
import javafx.scene.web.WebView;
import javafx.stage.Modality;
import javafx.stage.Stage;
import javafx.stage.StageStyle;
import javafx.util.Duration;

public class WebViewSnapshot extends Application {

    BorderPane rootPane;
    TranslateTransition animation;

    @Override
    public void start(Stage primaryStage) {
        Rectangle rect = new Rectangle(50, 50, 50, 50);
        rect.setFill(Color.CORAL);
        animation = createAnimation(rect);
        Button snapshotButton = new Button("Take snapshot");
        Pane pane = new Pane(rect);
        pane.setMinSize(600, 150);
        rootPane = new BorderPane(pane, null, null, snapshotButton, new Label("This is the main scene"));
        snapshotButton.setOnAction(e -> {
            // html file being shown in the webview
            File htmlFile = new File("generated/template.html");
            // the resulting snapshot png file
            File aboutFile = new File("generated/about.png");
            generate(htmlFile, aboutFile, 1280, 720);
        });
        BorderPane.setAlignment(snapshotButton, Pos.CENTER);
        BorderPane.setMargin(snapshotButton, new Insets(5));
        Scene scene = new Scene(rootPane);
        primaryStage.setScene(scene);
        primaryStage.show();
    }

    private TranslateTransition createAnimation(Rectangle rect) {
        TranslateTransition animation = new TranslateTransition(Duration.seconds(1), rect);
        animation.setByX(400);
        animation.setCycleCount(Animation.INDEFINITE);
        animation.setAutoReverse(true);
        animation.play();
        return animation;
    }

    public void generate(File htmlFile, File outputFile, double width, double height) {
        animation.pause();
        // rootPane is the root of the original scene; in an FXML controller you get this
        // when you assign it an fx:id
        rootPane.setEffect(new GaussianBlur());
        Stage primaryStage = (Stage) rootPane.getScene().getWindow();
        // create a separate webview holding the same html content as in the original scene,
        // with the size the snapshot should have
        WebView webView = new WebView();
        webView.setPrefSize(width, height);
        AnchorPane snapshotRoot = new AnchorPane(webView);
        webView.getEngine().load(htmlFile.toURI().toString());
        Stage popupStage = new Stage(StageStyle.TRANSPARENT);
        popupStage.initOwner(primaryStage);
        popupStage.initModality(Modality.APPLICATION_MODAL);
        // this popup doesn't really show anything (size = 1x1), it just holds the snapshot webview
        popupStage.setScene(new Scene(snapshotRoot, 1, 1));
        // pause to make sure the webview/picture is completely rendered
        PauseTransition pt = new PauseTransition(Duration.seconds(2));
        pt.setOnFinished(new EventHandler<ActionEvent>() {
            @Override
            public void handle(ActionEvent event) {
                WritableImage image = webView.snapshot(null, null);
                // writing a png to outputFile;
                // writing a JPG like this will result in a pink JPG, see other posts
                // for converting ARGB to RGB first
                String format = "png";
                try {
                    ImageIO.write(SwingFXUtils.fromFXImage(image, null), format, outputFile);
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    rootPane.setEffect(null);
                    popupStage.hide();
                    animation.play();
                }
            }
        });
        // after the pause, the onFinished handler will take + write the snapshot
        pt.play();
        // GO!
        popupStage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}

ActionScript contained in externally loaded SWF files will be ignored on iOS devices

I am developing an iOS app in Flash CS6 and I'm trying to load an external SWF. The preloader loads the external SWF, and the loading itself seems to work. After loading the external SWF, I add its movieclips to the stage. But strangely, on iOS devices the movieclips are not added to the stage.
Here's the code for the preloader:
package {

    import flash.display.MovieClip;
    import flash.display.BitmapData;
    import flash.display.Loader;
    import flash.display.Bitmap;
    import flash.utils.getDefinitionByName;
    import flash.events.Event;
    import flash.events.ProgressEvent;
    import flash.net.URLRequest;
    import flash.system.LoaderContext;
    import flash.system.ApplicationDomain;

    public class Test extends MovieClip {

        private var lobbyBgAssetsLoader:Loader = new Loader();
        private var ldrContext:LoaderContext;

        public function Test() {
            var urlStr:String = "cards.swf";
            lobbyBgAssetsLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, LobbyBgAssetsswfLoaded);
            lobbyBgAssetsLoader.load(new URLRequest(urlStr));
            ldrContext = new LoaderContext(false, ApplicationDomain.currentDomain, null);
        }

        private function LobbyBgAssetsswfLoaded(e:Event):void {
            txt.text = "loaded complete test ";
            var logoSmall:Bitmap = new Bitmap();
            var classDefinition:Class = lobbyBgAssetsLoader.contentLoaderInfo.applicationDomain.getDefinition("cardBg") as Class;
            txt.text = "after load ";
            var img:MovieClip = new classDefinition() as MovieClip;
            this.addChild(img);
        }
    }
}
All ActionScript gets converted into native Objective-C code when you compile your app for iOS. So even if your app is able to load a SWF file from the internet, the ActionScript inside it will not have been converted to Objective-C and will therefore be ignored.
Apple does not allow the actual Flash Player runtime inside an iOS app, so an app cannot play back SWF content.

Adobe AIR for iOS stream from iPhone Camera to RTMP server

I have a client .ipa file that I test on my iOS device. I have successfully got the app to run on the iPhone using Adobe AIR for iOS, building it with Adobe Flash CC.
When I launch the app on the iPhone, the video connection does not connect to the Red5 streaming server, and therefore the app cannot broadcast the stream from the camera to the server.
I have used StageVideo. When I launch the app on a local PC with a webcam, and launch another app on the iPhone to receive the stream, I can see the live broadcast from my PC's webcam.
But I want to capture from the iPhone camera and send/receive the live stream through the Red5 server.
How can I achieve this? I have placed my current code below.
import flash.display.Sprite;
import flash.display.MovieClip;
import flash.events.NetStatusEvent;
import flash.events.StageVideoAvailabilityEvent;
import flash.events.StageVideoEvent;
import flash.geom.Rectangle;
import flash.media.Camera;
import flash.media.Microphone;
import flash.media.StageVideo;
import flash.media.StageVideoAvailability; // needed for StageVideoAvailability.AVAILABLE below
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.Responder;

var nc:NetConnection;
var good:Boolean;
var netOut:NetStream;
var netIn:NetStream;
var cam:Camera;
var mic:Microphone;
var responder:Responder;
var r:Responder;
var vidOut:Video;
var vidIn:Video;
var outStream:String;
var inStream:String;
var sv:StageVideo;
var sva:Boolean;

stage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY, onAvail);

function onAvail(e:StageVideoAvailabilityEvent):void
{
    sva = (e.availability == StageVideoAvailability.AVAILABLE);
    trace(sva);
    var rtmpNow:String = "rtmp://192.168.1.7/test1";
    nc = new NetConnection();
    nc.client = this;
    nc.connect(rtmpNow, "trik");
    nc.addEventListener(NetStatusEvent.NET_STATUS, getStream);
}

function onRender(e:StageVideoEvent):void
{
    sv.viewPort = new Rectangle(0, 0, 240, 180);
}

function getStream(e:NetStatusEvent):void
{
    good = (e.info.code == "NetConnection.Connect.Success");
    if (good)
    {
        // Here we call functions in our Java application
        setCam();
        //setMic();
        // play the streamed video back
        netIn = new NetStream(nc);
        if (sva)
        {
            // publish the local camera
            netOut = new NetStream(nc);
            //netOut.attachAudio(mic);
            netOut.attachCamera(cam);
            netOut.publish("tester", "live");
            sv = stage.stageVideos[0];
            sv.addEventListener(StageVideoEvent.RENDER_STATE, onRender);
            sv.attachNetStream(netIn);
            netIn.play("tester");
        }
        else
        {
            setVid();
            vidIn.attachNetStream(netIn);
            netIn.play("tester");
        }
    }
}

function streamNow(streamSelect:Object):void
{
    trace("hello");
}

function setCam():void
{
    cam = Camera.getCamera();
    cam.setMode(240, 180, 15);
    cam.setQuality(0, 85);
}

function setMic():void
{
    mic = Microphone.getMicrophone();
    mic.rate = 11;
    //mic.setSilenceLevel(12, 2000);
}

function setVid():void
{
    trace("vid");
    vidIn = new Video(240, 180);
    addChild(vidIn);
    vidIn.x = 0;
    vidIn.y = 0;
}
Your code mostly looks fine, but I would separate the ns.publish and ns.play parts. In my opinion you shouldn't try to play until the publish has succeeded (i.e. wait for the NetStream.Publish.Start status event on the outgoing stream). Also, if you're not just testing the round trip to the server, I would attach the camera directly to the StageVideo, if that's allowed on iOS.

Flare3D FlashBuilder Mobile Error

package
{
    import flash.display.Sprite;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.Event;

    import flare.basic.Scene3D;
    import flare.basic.Viewer3D;
    import flare.core.Pivot3D;

    [SWF(frameRate = 60, width = 800, height = 450, backgroundColor = 0x000000)]
    /**
     * Starting the project.
     */
    public class MyFirstApp extends Sprite
    {
        private var scene:Scene3D;
        private var planet:Pivot3D;
        private var astronaut:Pivot3D;

        public function MyFirstApp()
        {
            trace("Hello Flare3D");
        }
    }
}
I am trying to get Flare3D to work in an ActionScript Mobile Project in Flash Builder. When I run the project I receive an error message that says "VerifyError: Error #1014: Class mx.core::ByteArrayAsset could not be found." All I have in this code is three variables; I have removed all other code and I still receive this message. I have added Flare3D to the ActionScript build path. My goal is to take the Yellow Planet tutorial (http://www.flare3d.com/demos/yellowplanet/) and run it on the iOS AIR Simulator. If I don't declare the three variables, the trace statement executes with no error message. How does adding three variables cause this error, or rather, what am I overlooking here?
Fixed the mx.core::ByteArrayAsset error by adding the Flex SDK to my ActionScript Mobile project. Scene3D references that class, so when you use Scene3D the Flex SDK must be linked to the project.
