I am using Azure Maps (the atlas web SDK) in my Azure Web App. I want to move the symbols smoothly without refreshing the whole page.
I made some changes to the existing code: I set a time interval on the AJAX call and add a new symbol layer to the map.
But I am facing an issue.
I am getting the error
map is undefined.
Here is the code:
function GetJsonMap(jsondata) {
    if (typeof jsondata !== 'undefined') {
        var gps_data = jsondata;
        for (var i = 0; i < gps_data.length; i++) {
            var point = new atlas.data.Point([gps_data[i][0], gps_data[i][1]]);
            var feature = new atlas.data.Feature(point, {
                name: gps_data[i][2],
                description: '[' + gps_data[i][0] + ", " + gps_data[i][1] + ']'
            });
            datasource.add(feature);
        }

        //Add a layer for rendering point data as symbols.
        var symbolLayer = new atlas.layer.SymbolLayer(datasource, null, {
            iconOptions: {
                image: 'pin-red'
            }
        });

        debugger;
        // $("#iotmap")
        map.layers.add(symbolLayer); // <-- getting the error here
    }
}
I am calling this GetJsonMap function from the AJAX call.
If you are getting an error that map is undefined, it means that it is out of scope in your code. Is it a global or a local variable?
Looking at your code, I highly recommend that you create the symbol layer after the map's ready event has fired, and only create it once; the majority of the code samples for Azure Maps do this. The way your code is now, it will add a new layer every time the GetJsonMap function is called, so you will end up with several layers over time that are all trying to render the same data on the map. Note that the data is managed by the data source; layers only render what's in the data source.
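As a rough sketch of that pattern (assuming the map div is the #iotmap element referenced in your commented-out line, and that the authentication and camera options are set elsewhere), the data source and the layer are created once in the ready handler, and GetJsonMap only updates the data source:

var map, datasource;

map = new atlas.Map('iotmap', { /* your authentication and camera options */ });

//Wait until the map resources are ready before adding the data source and layer.
map.events.add('ready', function () {
    //Create the data source once and add it to the map.
    datasource = new atlas.source.DataSource();
    map.sources.add(datasource);

    //Create the symbol layer once; it renders whatever is currently in the data source.
    map.layers.add(new atlas.layer.SymbolLayer(datasource, null, {
        iconOptions: {
            image: 'pin-red'
        }
    }));
});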
Another tip: you are looping through and adding each point one by one to the data source. A slightly faster way is to add all the points to an array and then add the array to the data source, because every time data is added to the data source it tries to update the map. Additionally, if you want to overwrite all data on the map, instead of clearing the data source and then adding your data, use the setShapes function of the data source. This does the clear and add in one function and only triggers a single render update on the map.
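Putting both tips together, GetJsonMap could be reduced to something like this sketch (assuming the datasource variable created in the ready handler above is in scope):

function GetJsonMap(jsondata) {
    if (typeof jsondata !== 'undefined') {
        var features = [];

        //Build all the features first instead of adding them one by one.
        for (var i = 0; i < jsondata.length; i++) {
            features.push(new atlas.data.Feature(
                new atlas.data.Point([jsondata[i][0], jsondata[i][1]]), {
                    name: jsondata[i][2],
                    description: '[' + jsondata[i][0] + ", " + jsondata[i][1] + ']'
                }));
        }

        //Overwrite the existing data in one call; this clears and adds in a single render update.
        datasource.setShapes(features);
    }
}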
I am developing a master-detail Fiori app using SAPUI5. As the detail contains more than 40 columns, I made separate OData services for master and detail.
On the master page, the data comes in correctly. Now my task is that when the user clicks Detail on any table line, the next page opens with details based on two key values of the master table.
I'm getting the two keys into variables in the detail page as follows, and it is working fine:
var spayid = jQuery.sap.getUriParameters().get("payid");
var spaydt = jQuery.sap.getUriParameters().get("paydt");
Next, I created two filters as follows, which are also working fine:
var filter1 = new Filter({
    path: "Laufi",
    operator: FilterOperator.EQ,
    value1: spayid
});
var filter2 = new Filter({
    path: "Laufd",
    operator: FilterOperator.EQ,
    value1: spaydt
});
Now I am calling the OData service, which is also working fine:
var oODataModel = new ODataModel("proxy/http/FIORI-DEV.abc.com:8000/sap/opu/odata/sap/ZASA_FI_pay_D_SRV?sap-client=100", {
    json: true,
    useBatch: false
});
this.getView().setModel(oODataModel);
Now I don't know how to filter the data. What should be included above so that it filters the data according to my filters filter1 and filter2? I have tried the following, but it is not working:
filters : [ filter1, filter2 ],
json: true,
useBatch: false
I am very good with ABAP but not an expert in SAPUI5; I am in the learning phase.
First of all, I was thinking of passing parameters to the OData service so that only the required data is fetched, meaning my OData call would look like this:
new ODataModel("proxy/http/FIORI-DEV.abc.com:8000/sap/opu/odata/sap/ZASA_FI_PAYMENT_D_SRV/PdetailSet(Laufi= spayid, Laufd = spaydt)?sap-client=100");
But this does not seem to be possible.
The second option is to fetch the whole detail set from the OData service and then apply the filter while binding it to the table.
The purpose of the sap.ui.model.Filter class is usually to apply filters to lists on the UI. For example, if you have a list of items and you want to limit that list to a subset of items which fulfills certain criteria.
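For example (a quick sketch, assuming a hypothetical table with the id detailTable whose rows come from an items aggregation binding), filters like yours would be passed to the list binding rather than to the model itself:

// Apply filter1 and filter2 to the table's items binding (the id is made up for illustration).
this.byId("detailTable").getBinding("items").filter([filter1, filter2]);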
But what you have here appears to be a classic master-detail scenario: you have a list of items, and when the user selects one, you show more information about that one item.
The usual solution for such a scenario is to assign the full model to the detail-view and then use an element binding (also known as "context binding") on the view to tell it which item to display.
When the source of the item is a click on an element which already had an element binding, then you can actually retrieve the correct binding path from the click event and just apply it to your detail-view.
From one of the official demos:
onItemSelected: function(oEvent) {
    var oSelectedItem = oEvent.getSource();
    var oContext = oSelectedItem.getBindingContext("products");
    var sPath = oContext.getPath();
    var oProductDetailPanel = this.byId("productDetailsPanel");
    oProductDetailPanel.bindElement({ path: sPath, model: "products" });
}
When you don't have any convenient way to get an element path, you have to construct one yourself:
var detailPanel = this.getView().byId("idOfDetailPanel");
detailPanel.bindElement("PdetailSet(Laufi = " + spayid +", Laufd = " + spaydt + ")");
The latter code snippet does of course assume that the OData service actually supports access with a key consisting of Laufi and Laufd. This is decided by:
The definition of the key fields of the entity type in the SAP Gateway Service Builder (transaction SEGW)
The ABAP implementation of the method get_entity of the data provider class of that OData service.
I'm writing a function that will parse certain websites and fetch data from there, which will be used to create instances of a class. I'm able to successfully extract the data when it is retrieved using the getElementById() function, but for some reason, the getElementsByClassName() always returns a node list with 0 elements.
The site I'm currently parsing is here.
If you search for 'datas-nev', you will find exactly one match:
<p class="datas-nev"><b>Kutya neve: </b>Jhonny</p>
And here is the code used for parsing:
import 'package:html/parser.dart' show parse;
...
final response = await http.get(URL);
var document = parse(response.body);
var detailsContainer = document.getElementById('husky_details_container_right');
var dogName = new List<Node>();
dogName = document.getElementsByClassName('datas-nev');
The contents of the detailsContainer can be extracted successfully, for example this gives me back a string of relevant data I will use later:
var humanBehaviourValue;
try { humanBehaviourValue = detailsContainer.nodes[1].nodes[19].nodes[1].nodes[7].nodes[1].toString(); }
catch (e) { humanBehaviourValue = 'N/A'; }
But when I check the value of dogName in the debug window, I get the following:
dogName = {_growableList} size = 0
I already tried initializing the dogName 'properly' by List<Node> dogName = new List<Node>(); but it didn't help. I also tried other datas-* values, but it seems the parser can't find them. I even tried using just datas (because that is a div, while others are paragraphs), but that didn't help either.
Basically I could just hardwire the name and some data (breed, color, etc) as those never really change, but the location of the shelter can change, and keeping it up-to-date by scraping the data seems better than pushing updates out manually. That means I mostly need the value of datas-helyszin but that isn't parsed either.
As @Günter Zöchbauer pointed out, the code actually works. I was just looking for the value too soon, before it was actually fetched...
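For anyone hitting the same thing, here is a minimal sketch of the working flow (the function name fetchDogData is made up for illustration; the point is simply that the response is awaited before the parsed document is inspected):

import 'package:html/parser.dart' show parse;
import 'package:http/http.dart' as http;

Future<void> fetchDogData(String url) async {
  // Wait for the HTTP response before parsing anything.
  final response = await http.get(Uri.parse(url));
  final document = parse(response.body);

  // Only inspect the parsed nodes once the await above has completed.
  final dogName = document.getElementsByClassName('datas-nev');
  print(dogName.length); // 1 for the page above
}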
I have a list of products that I download as a JSON file from the server. Each item contains a link to an image stored on the server.
Now I want to be able to see the products when offline, so I store the downloaded JSON file in forge.prefs (http://docs.trigger.io/en/v1.3/modules/prefs.html) and pull it out to display the items on screen. It works nicely, but I also need to store the images locally so they are displayed correctly.
To achieve this, I'm trying to use forge.file.cacheURL (http://docs.trigger.io/en/v1.3/features/cache.html) but can't handle the correct order of operations. To cache the images, I loop over the JSON items and for each one I call forge.file.cacheURL and store the returned URL back into the JSON item. But here is the problem: forge.file.cacheURL runs asynchronously, so my loop over the items finishes and my code continues on to display the images (view the items) while forge.file.cacheURL is still gathering and caching the images in the background. I need to somehow detect that the last item has been cached and then refresh the view on screen to use the correct image URLs, or something else that will lead to what I need.
Hopefully you understand the concept. How should I handle this properly?
Since v1.4.26 you've been able to permanently store images (see http://docs.trigger.io/en/v1.4/release-notes.html#v1-4-26), rather than just cache them. Depending on your needs, that might be a better option than forge.file.cacheURL and forge.file.isFile.
I don't follow exactly the situation you describe, but something like this will let you wait for several asynchronous things to finish before doing something:
// e.g.
var jsonCache = {
    one: "http://example.com/one.jpg",
    two: "http://example.com/two.jpg",
    three: "http://example.com/three.jpg"
};
var cacheCount = 0;
for (var name in jsonCache) {
    if (jsonCache.hasOwnProperty(name)) {
        // Capture name and imageURL per iteration so the asynchronous
        // callbacks don't all see the last value of the loop variable.
        (function (name, imageURL) {
            cacheCount += 1;
            forge.file.cacheURL(imageURL, function (file) {
                forge.prefs.set(name, file, function () {
                    // The callbacks only run after the loop has finished
                    // incrementing, so this count is safe to check.
                    cacheCount -= 1;
                    if (cacheCount <= 0) {
                        alert('all cached');
                    }
                });
            });
        }(name, jsonCache[name]));
    }
}
I've never used ActionScript before, but I've just had to dive into it in order to get a map working.
I'm using the following code to add a map marker, replacing a previous one if one exists:
public var tracer:Array = new Array();
public var tracerLng:Number = 0;

for (var i:Number = 1; i < 64000; i++)
{
    //Check if there is already a marker, if so get rid of it
    if (tracerLng > 0) {
        map.removeOverlay(tracer[0]);
        tracer[0] = null;
        tracer.pop();
    }

    // Set up a marker
    var trackMrk:Marker = new Marker(
        new LatLng(_lat, _lng),
        new MarkerOptions({
            strokeStyle: new StrokeStyle({color: 0x987654}),
            fillStyle: new FillStyle({color: 0x223344, alpha: 0.8}),
            radius: 12,
            hasShadow: true
        })
    );

    //Add the marker to the array and show it on the map
    tracerLng = tracer.push(trackMrk);
    map.addOverlay(tracer[0]);
}
My first problem is what happens when running this code (the 64,000 repeats are for testing; the final application won't need to run quite THAT many times). Either way, memory usage increases by about 4 kB/s - how do I avoid that happening?
Secondly - could anyone advise me on how to make that program more graceful?
Thanks in advance for advice
This isn't a memory leak; it's probably the result of created events - enter frames, mouse events, custom events, etc. Provided that your memory doesn't keep going up and up forever, it's nothing to be worried about - it'll get garbage collected in due course.
Some points on your code:
The tracer Array doesn't seem to do anything - you only seem to be holding one thing in there at a time, so an array makes no sense. If you need an Array, use Vector instead. It's smaller and faster. More so if you create one with a specific length.
Don't create a new Marker unless you need one. Reuse old objects. Learn about object pooling: http://help.adobe.com/en_US/as3/mobile/WS948100b6829bd5a6-19cd3c2412513c24bce-8000.html or http://lostinactionscript.com/2008/10/30/object-pooling-in-as3/
The LatLng and MarkerOptions (including the stroke and fill objects) don't seem to change (I'm assuming the LatLng object lets you set a new position). If that's the case, don't create new ones when you don't need to. If you need to create new ones, StrokeStyle and FillStyle seem good candidates for a "create once, use everywhere" policy.
Create a destroy() function or similar in your Marker class and explicitly call it when you need to delete one (just before setting it to null or popping it from the array). In the destroy() function, null out any references to anything that isn't a base type (int, Number, String, etc.). Garbage collection runs using a reference counting method and a mark-and-sweep method. Ideally, you want everything collected via reference counting, as it's collected sooner and avoids stalls in your program.
I explain memory management in AS3 a bit more here: http://divillysausages.com/blog/tracking_memory_leaks_in_as3
Also included is a class that helps you track down memory leaks if there are any
I am new to db4o. I have a big problem with persistence of an object graph. I am trying to migrate from an old persistence component to a new one, using db4o.
Before I persisted all the objects, the graph looked like below (take a look at the Zrodlo.Metadane.abstrakt string field with the focused value) [a view from the Eclipse debugger]. I persisted it with this code:
ObjectContainer db = Db4o.openFile(DB_FILE);
try {
    db.store(encja);
    db.commit();
} finally {
    db.close();
}
After that, I tried to read it back with this code:
ObjectContainer db = Db4o.openFile(DB_FILE);
try {
    Query q = db.query();
    q.constrain(EncjaDanych.class);
    ObjectSet<Object> objectSet = q.execute();
    logger.debug("objectSet.size" + objectSet.size());
    EncjaDanych encja = (EncjaDanych) objectSet.get(0);
    logger.debug("ENCJA" + encja.toString());
    return encja;
} finally {
    db.close();
}
and I got it (picture below): the string field "abstrakt" is null now!
I took a look at it using ObjectManager (picture below) and the abstrakt field has a non-null value there! The same value as in the first picture.
Please help me :) It is my second day with db4o. Thanks in advance!
I am attaching some code with the structure of the persisted class:
public class EncjaDanych {
    Map mapaIdRepo = new HashMap();
    public Map mapaNazwaRepo = new HashMap();
}
UPDATED:
When I tried to read only the Metadane object (there was only one such object), it was all right: its string field abstrakt could be read correctly.
try {
    Query q = db.query();
    q.constrain(Metadane.class);
    ObjectSet<Object> objectSet = q.execute();
    logger.error("objectSet.size" + objectSet.size());
    Metadane meta = (Metadane) objectSet.get(0);
    logger.debug("Metadane" + meta.toString());
    return meta;
} finally {
    db.close();
}
This is a common db4o FAQ: an issue with what db4o calls "activation". db4o won't instantiate the entire graph you stored when you load an object from an ObjectContainer. By default, objects are instantiated to a depth of 5. You can change the default configuration to a higher value, but that is not recommended, since it will slow down object loading in general because that depth is used everywhere you load an object with a query.
Two approaches are possible to solve your issue:
(1) You can activate an object to a desired depth by hand when you need a specific depth.
db.activate(encja, 10) // 10 is arbitrary
(2) You can work with Transparent Activation. There are multiple chapters on how to use Transparent Activation (TA) in the db4o tutorial and in the reference documentation.
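As a rough sketch, option (1) applied to the read code from the question would look like this (the same EncjaDanych query as above; the depth of 10 is arbitrary):

ObjectContainer db = Db4o.openFile(DB_FILE);
try {
    Query q = db.query();
    q.constrain(EncjaDanych.class);
    ObjectSet<Object> objectSet = q.execute();
    EncjaDanych encja = (EncjaDanych) objectSet.get(0);

    // Activate the graph below this object to the desired depth before the
    // container is closed, so nested fields such as Zrodlo.Metadane.abstrakt
    // are actually loaded instead of being left null.
    db.activate(encja, 10);

    return encja;
} finally {
    db.close();
}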
You're not setting a filter in your query so you're reading the first object. Are you sure you didn't have a previous object in the database?