Getting Data from a Web Service - wsdl

I did the following:
C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin>wsdl.exe http://ws.strikeiron.com/CensusData?WSDL
This created the following file, CensusData.cs, which I then added to my C# project.
I would like to see some data now so:
CensusData CD = new CensusData();
Looking at IntelliSense, I figured out that I will probably need to use one of the methods, named BeginGetEconomic_ForState, and here is where my C# skills are no longer good enough.
After typing CD.BeginGetEconomic_ForState("Utah", IntelliSense shows AsyncCallback callback, object asyncState.
What exactly should I do to get the data?
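Proxies generated by wsdl.exe normally follow the standard IAsyncResult pattern, so a Begin method has a matching End method that returns the data, and usually a plain synchronous method as well. Those counterparts (EndGetEconomic_ForState, GetEconomic_ForState) are assumptions about the generated CensusData.cs rather than something shown above, but a sketch along these lines should work:

CensusData CD = new CensusData();

// Start the call and pass a callback that picks up the result when it arrives.
CD.BeginGetEconomic_ForState("Utah", ar =>
{
    // EndGetEconomic_ForState is the assumed counterpart of the Begin method.
    var economicData = CD.EndGetEconomic_ForState(ar);
    Console.WriteLine(economicData);
}, null);

// If the proxy also contains a synchronous GetEconomic_ForState (it usually does),
// the simplest option is:
// var economicData = CD.GetEconomic_ForState("Utah");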

JShell: Accessing Objects Created by Snippets

I am very confused about something and I would appreciate some insight here.
Say I want to build a GUI that visualizes what is going on inside JShell, i.e. how the objects created by snippets reference each other and which snippet-created objects exist inside my running instance of JShell. How do I access these objects, and most of all, how do I access the way they reference each other?
A concrete example: I create a JShell instance and pass it a few snippets created by the user, which, for example, create an ArrayList and a few objects, and add those objects to the ArrayList.
How do I access this ArrayList and the objects contained within it to visualize this in a GUI?
To clarify further:
//say I create a JShell:
JShell jShell = JShell.create();
//which then evaluates user code passed from the GUI:
jShell.eval(userCode);
//userCode could be the following lines, each passed as a separate String:
"ArrayList<TestObject> allObj = new ArrayList<TestObject>();"
"TestObject tst = new TestObject();"
"TestObject tst2 = new TestObject();"
"allObj.add(tst);"
"allObj.add(tst2);"
How do I access "allObj"?
How do I access "tst" and the object it points to (the "TestObject" instance that "tst" refers to)?
I know eval() returns a list of SnippetEvents which contain the changed/added snippets, however, I can’t get my head around how to access the objects created by those snippets.
Assuming your classpath has access to TestObject, you could implement Serializable on that class. Upon completion of the eval, automatically run another snippet that serializes the result; then you can deserialize that object inside your own code, as sketched below.
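Here is a minimal sketch of that round-trip, assuming the default execution engine writes to the same working directory as the host JVM; the file name allObj.ser is made up, and String elements are used instead of TestObject so the example is self-contained.

import jdk.jshell.JShell;
import java.io.FileInputStream;
import java.io.ObjectInputStream;
import java.util.List;

public class JShellSnapshot {
    public static void main(String[] args) throws Exception {
        try (JShell jShell = JShell.create()) {
            // Build up state inside JShell, the way the user snippets would.
            jShell.eval("import java.util.*;");
            jShell.eval("import java.io.*;");
            jShell.eval("ArrayList<String> allObj = new ArrayList<>();");
            jShell.eval("allObj.add(\"tst\");");
            jShell.eval("allObj.add(\"tst2\");");

            // Ask the snippet side to serialize the list to a file.
            jShell.eval("try (ObjectOutputStream out = new ObjectOutputStream("
                    + "new FileOutputStream(\"allObj.ser\"))) { out.writeObject(allObj); }");

            // Deserialize it in the host JVM, where the GUI can inspect it.
            try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("allObj.ser"))) {
                @SuppressWarnings("unchecked")
                List<String> allObj = (List<String>) in.readObject();
                System.out.println(allObj); // prints [tst, tst2]
            }
        }
    }
}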

Using Dart for HTML5 app, but want to load a file from the server-side

I'm new to Dart, and trying to create my first Dart web game. Sorry if I missed an answered question related to this. I did search, but wasn't having much luck.
To load a level, I would like to be able to read in a text file with the level data, then process that and use it to build the level. Unfortunately, I am running into the issue that dart:io and dart:html cannot both be imported if you want to use the File() object. From what I can tell, dart:html's File() object is client-side, so it would not be able to open the text file I will have on the server.
Is there another way to read in a text file from the server? If not, do I need to set up a database just to store the game data, or is there a better option I'm not thinking about here?
In case it helps, the game data I'm working with currently is just a text file that gives a map of what the level will look like. For example:
~~~~Z~~~~
P
GGGGLLGGG
Each of those characters would denote a type of block to be placed in the level. It's not the best way to store levels, but it is pretty easy to create and easy to read in and process.
Thanks so much for the help!
If the file you are loading is a sibling of the index.html your game is loaded from, then you can just make an HTTP request to fetch the file.
To download web/level1.json you can use
import 'dart:html';

Future<String> getGameData(String name) async {
  var response = await HttpRequest.getString('${name}.json');
  print(response);
  return response;
}

otherMethod() async {
  var data = await getGameData('level1');
}
See also https://api.dartlang.org/stable/1.24.3/dart-html/HttpRequest-class.html
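To get from the fetched string to the character map described in the question, splitting the response into lines is enough. A small sketch, assuming the level is served as a text file (e.g. web/level1.txt) next to index.html:

import 'dart:html';

Future<List<String>> loadLevelRows(String name) async {
  var text = await HttpRequest.getString('$name.txt');
  // Each non-empty line becomes one row of the map, e.g. '~~~~Z~~~~'.
  return text.split('\n').where((row) => row.isNotEmpty).toList();
}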

Sensenet API: Best Way to Create Folder structure

I want to upload content to a specific path in sensenet. This path may not have been created on sensenet yet, so if the path does not exist, the system has to create it.
Using the sensenet Client API, the method available to create content runs asynchronously. I tried to force it to run synchronously, but that does not seem to work, because sometimes the second folder is not created...
Here's a code sample:
private async Task CreateFolder(String parentPath, String folderName)
{
    var folder = Content.CreateNew(parentPath, "Folder", folderName);
    await folder.SaveAsync();
}

CreateFolder("/Root/Sites/Test/DocumentWorkSpace", "folder").Wait();
CreateFolder("/Root/Sites/Test/DocumentWorkSpace/folder", "subfolder").Wait();
I can use Tools.EnsurePathAsync(path) to create the folder structure. But after that, I want to upload the file... (and I'm having the same problem there as with the folder structure reported above.)
Task.Run(() => Tools.EnsurePathAsync(pathDocType)).Wait();
Task.Run(() =>
{
    var stream = new MemoryStream(byteContent);
    Content.UploadAsync(pathDocType, "test.doc", stream).WaitAndUnwrapException();
    stream.Dispose();
}).Wait();
You have multiple options, depending on your use case.
Importing a whole folder structure
Take a look at the import api in the client library. It is actually a single method that you can use to import a folder structure from the file system. It handles all folder creation and upload internally:
await Importer.ImportAsync(sourcePath, targetPath, options);
The options object can be used to customize the behavior of the algorithm (e.g. max degree of parallelism for large structures, defining a custom container type instead of the default Folder or defining an optional logging delegate method).
This method is scalable, meaning it is capable of importing a huge number of folders and files efficiently. It will return after importing every folder and file (filtered by the optional filters in the options parameter).
Ensuring a parent folder chain
There is a single method for creating parent folders (if they do not exist yet).
await Tools.EnsurePathAsync(path);
You can call this with a non-existing path (e.g. /Root/Folder1/Folder2) and it will create it for you. This method only deals with folders; it has nothing to do with files.
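Putting the two steps from the question together (ensure the folder chain, then upload), the simplest approach is to await both calls in sequence instead of wrapping them in Task.Run(...).Wait(). A rough sketch, reusing pathDocType, byteContent and the test.doc name from the question:

private async Task EnsureFolderAndUploadAsync(string pathDocType, byte[] byteContent)
{
    // Create the whole parent chain first (does nothing if it already exists).
    await Tools.EnsurePathAsync(pathDocType);

    // Start the upload only after the folders are in place.
    using (var stream = new MemoryStream(byteContent))
    {
        await Content.UploadAsync(pathDocType, "test.doc", stream);
    }
}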

EF - generic "AddOrUpdate" method suddenly breaks

I am using Entity Framework 4 (database-first approach) in my ASP.NET 4.0 Webforms app.
What I'm basically doing is fetching the entity to be edited from my ObjectContext, and displaying the fields the user should enter data into (or modify existing data) on a web form.
When the time comes to store the data back, I read out the values from the web form, build up a new entity instance, and then I have a generic method called AddOrUpdate that detects whether this is a new entity (so it needs to be inserted) or an existing one (so the existing data needs to be updated).
My method uses the EntityKey and checks whether the object context already knows about this object - very similar to what Cesar de la Torre of Microsoft shows in his blog post:
public static void AddOrUpdate(ObjectContext context, EntityObject objectDetached)
{
    if (objectDetached.EntityState == EntityState.Detached)
    {
        object currentEntityInDb = null;
        if (context.TryGetObjectByKey(objectDetached.EntityKey, out currentEntityInDb))
        {
            // attach and update the existing entity
        }
        else
        {
            // insert new entity into entity set
            context.AddObject(objectDetached.EntityKey.EntitySetName, objectDetached);
        }
    }
}
This worked just fine - for the longest time. But today, suddenly, out of the blue, I keep getting exceptions like this on the context.TryGetObjectByKey statement:
System.InvalidOperationException: Object mapping could not be found for Type with identity 'MyEntityType'
I cannot remember having changed anything in this core code at all - and the entity type is defined, the ID value that's stored in the EntityKey does indeed exist in the database... everything should be fine - but it keeps failing on me...
What on earth happened here??
I did find a few blog and forum posts on the topic, but none could really enlighten me or help me fix the issue. I must have messed up something - bad - but I really cannot see the forest for the trees - any hints?
Generally this sort of issue happens when EF can't find the assembly that contains the type. Without seeing the full exception it is difficult to figure out exactly, but your recent changes and the way you are using EF seem to be the cause.
EF usually picks the type directly from the type itself when it has to access it through an ObjectSet on the context. In other cases, where the type is not available from the context of the call, it looks at the calling assembly and any DLLs referenced by the calling assembly. If it can't find the type there, it throws this error message.
You can use the LoadFromAssembly method in the MetadataWorkspace of the context.
ObjectContext.MetadataWorkspace.LoadFromAssembly(assembly).
This way EF will know where to look for your types.
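For example, you could register the entity's assembly at the top of the AddOrUpdate helper shown above, before the TryGetObjectByKey call; whether that is the right place in your application is an assumption, but the call itself looks like this:

// Make sure EF can map the detached entity's CLR type, even if its assembly
// is not referenced by the assembly making this call.
context.MetadataWorkspace.LoadFromAssembly(objectDetached.GetType().Assembly);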

Is it OK to open a DB4o file for query, insert, update multiple times?

This is the way I am thinking of using DB4o. When I need to query, I would open the file, read and close:
using (IObjectContainer db = Db4oFactory.OpenFile(Db4oFactory.NewConfiguration(), YapFileName))
{
    try
    {
        List<Pilot> pilots = db.Query<Pilot>().ToList<Pilot>();
    }
    finally
    {
        try { db.Close(); }
        catch (Exception) { };
    }
}
At some later time, when I need to insert:
using (IObjectContainer db = Db4oFactory.OpenFile(Db4oFactory.NewConfiguration(), YapFileName))
{
    try
    {
        Pilot pilot1 = new Pilot("Michael Schumacher", 100);
        db.Store(pilot1);
    }
    finally
    {
        try { db.Close(); }
        catch (Exception) { };
    }
}
This way, I thought I would keep the file tidier by only having it open when needed and closed most of the time. But I keep getting an InvalidCastException:
Unable to cast object of type 'Db4objects.Db4o.Reflect.Generic.GenericObject' to type 'Pilot'
What's the correct way to use DB4o?
No, it's not a good idea to work this way. db4o ObjectContainers are intended to be kept open all the time your application runs. A couple of reasons:
db4o maintains a reference system to identify persistent objects, so it can do updates when you call #store() on an object that is already stored (instead of storing new objects). This reference system is closed when you close the ObjectContainer, so updates won't work.
Class Metadata would have to be read from the database file every time you reopen it. db4o would also have to analyze the structure of all persistent classes again, when they are used. While both operations are quite fast, you probably don't want this overhead every time you store a single object.
db4o has very efficient caches for class and field indexes and for the database file itself. If you close and reopen the file, you take no advantage of them.
The way you have set up your code, there could be failures when you work with multiple threads. What if two threads wanted to open the database file at exactly the same time? db4o database files can be opened only once. It is possible to run multiple transactions and multiple threads against the same open instance, and you can also use Client/Server mode if you need multiple transactions.
Later on you may like to try Transparent Activation and Transparent Persistence. Transparent Activation lazily loads object members when they are first accessed. Transparent Persistence automatically stores all objects that were modified in a transaction. For Transparent Activation (TA) and Transparent Persistence (TP) to work you certainly have to keep the ObjectContainer open.
You don't need to worry about constantly having an open database file. One of the key targets of db4o is embedded use in (mobile) devices. That's why we have written db4o in such a way that you can turn your machine off at any time without risking database corruption, even if the file is still open.
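As a rough sketch of the keep-it-open approach (the static holder class and the open/shutdown hooks are placeholders for wherever your application manages its lifetime):

public static class Database
{
    // One container for the whole lifetime of the application.
    public static IObjectContainer Db { get; private set; }

    public static void Open(string yapFileName)
    {
        Db = Db4oFactory.OpenFile(Db4oFactory.NewConfiguration(), yapFileName);
    }

    public static void Shutdown()
    {
        Db.Close();
    }
}

// Queries and stores then reuse the same open instance:
List<Pilot> pilots = Database.Db.Query<Pilot>().ToList<Pilot>();
Database.Db.Store(new Pilot("Michael Schumacher", 100));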
Possible reasons why you are getting a GenericObject back instead of a Pilot object:
This can happen when the assembly name of the assembly that contains the Pilot class has changed between two runs, either because you let Visual Studio autogenerate the name or because you changed it by hand.
Maybe "db4o" is part of your assembly name? One of the recent builds was too aggressive at filtering out internal classes. This was fixed quite some time ago, so you may like to download and try the latest release; "development" or "production" should both be fine.
In a presentation I once did, I saw really weird symptoms when db4o ObjectContainers were opened in a "using" block. You probably want to work without that anyway and keep the db4o ObjectContainer open all the time.
It is OK to reopen the database multiple times. The problem would be performance and losing the "identity". Also, you can't keep a reference to the result of a query and try to iterate it after closing the db (based on your code, it looks like you want to do that).
GenericObjects are instantiated when the class cannot be found.
Can you provide a full, minimal sample that fails for you?
Also, which db4o version are you using?
Best
