My question may not be clear from the title [I could not phrase it exactly].
e.g. Texture2D picture = Content.Load<Texture2D>("myPicture");
What happens in memory when the code above runs? As I understand it, Content caches "myPicture" in memory and returns a reference to it in the Texture2D variable picture. Am I wrong? If "myPicture" is loaded into another Texture2D object, "myPicture" is not duplicated; only a reference is returned.
Is each file (or content file) loaded through Content cached in memory (i.e. allocated in RAM) without duplication? (I believe this, together with everything written above, is what I would like checked.)
Thanks!
Each instance of ContentManager will only load any given resource once. The second time you ask for a resource, it will return the same instance that it returned last time.
ReferenceEquals(Content.Load<Texture2D>("something"),
                Content.Load<Texture2D>("something")) == true
To do this, ContentManager internally maintains a list of all the content it has loaded. This list prevents the garbage collector from cleaning up those resources - even if you are no longer using them.
To unload the resources and clear that internal list, call ContentManager.Unload. This will free up the memory the loaded resources were using. Now if you ask for the same resource again - it will be re-loaded.
Of course, if you are using those resources when you call Unload, all of those shared instances that you loaded will be disposed and unusable.
Finally, don't call Dispose on anything that comes out of ContentManager.Load, as this will break all the instances that are being shared and cause problems when ContentManager tries to dispose of them in Unload later on.
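As a minimal sketch of the pattern described above (assuming the standard XNA Game/ContentManager API; the MyGame class name and the "myPicture" asset are just placeholders), loading, sharing, and unloading would look roughly like this:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class MyGame : Game
{
    GraphicsDeviceManager graphics;
    Texture2D first;
    Texture2D second;

    public MyGame()
    {
        graphics = new GraphicsDeviceManager(this);
        Content.RootDirectory = "Content";
    }

    protected override void LoadContent()
    {
        first = Content.Load<Texture2D>("myPicture");
        second = Content.Load<Texture2D>("myPicture"); // same cached instance, no extra copy in memory

        System.Diagnostics.Debug.Assert(ReferenceEquals(first, second));
    }

    protected override void UnloadContent()
    {
        // Disposes everything this ContentManager loaded and clears its cache;
        // do not call Dispose on 'first' or 'second' yourself.
        Content.Unload();
    }
}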
I would like one instance of a model in memory to serve as a template for creating other objects, for performance reasons, so that the duplicates look like the original object but otherwise share no components with the object they are initialized from, as if they had been loaded with Model.find(template_object.id). I've tried some of the available solutions, but none seems to do what I need: .dup and .deep_dup create a new object with a nil id, and .clone makes some of the fields common to both the initializer and the initialized object.
Currently my API gives out the original objects that I keep as class variables, but I discovered that this leads to obscure memory leaks when the code using the objects manipulates their associations - these are kept in memory indefinitely. I hope that by giving out copies, the associations of the template objects will stay untouched and the leak will be gone.
This sounds like a use case for defining a class and simply initializing instances. You can customize whatever properties you want shared in the initializer called by MyClass.new. Without knowing more about your needs, I will add that if you must store a template in memory, you could store it as a class variable, perhaps @@template, but I would need to hear more to opine further. 😄
What I found when browsing the Rails source is the .instantiate method:
MyModel.instantiate(@my_other_instance.attributes_before_type_cast.deep_dup)
I have the following class:
public class CurrentOrder
{
    // Contains the current order values, which are global to the whole application.
    public static List<OrderArticleViewModel> listOfOrderArticles = new List<OrderArticleViewModel>();
    public static string orderCustomerName;
    public static string orderCustomerId;
    public static string orderNumber;
    public static string orderDateAndHour;
    public static DateTime executionOrderDate = DateTime.Now.AddDays(1);

    private CurrentOrder()
    {
    }
}
I use its fields throughout the whole application as global variables, for example like this: CurrentOrder.orderNumber. When I am on a certain activity and press the back button, I want to clear all the class field values, and I do it like this:
CurrentOrder.listOfOrderArticles = new List<OrderArticleViewModel>();
CurrentOrder.orderCustomerName = null;
CurrentOrder.orderCustomerId = null;
CurrentOrder.orderNumber = null;
CurrentOrder.orderDateAndHour = null;
CurrentOrder.executionOrderDate = DateTime.Now.AddDays(1);
But as far as I know, the old values of these fields stay in memory; the only thing that changes is that my variables now point somewhere else. If I click the back button 1000 times, I will have 1000 sets of old field values in memory with nothing referencing them. I've heard that the garbage collector takes care of destroying values that nothing points to, but how often does that occur? Is it possible to press the back button 100 times without the garbage collector cleaning up?
There is no fixed time interval between garbage collections. The garbage collector is invoked based on the amount of remaining allocatable memory. Both C# and Java are managed, object-oriented languages, so we don't need to allocate and release memory manually as in C/C++.
The garbage collector helps the developer release memory. Xamarin.Android uses C#, so it relies on the CLR (the Mono runtime) to manage the process's memory (native Android is based on ART and Dalvik).
Here are the conditions under which the GC will be called:
Garbage collection occurs when one of the following conditions is true:
1. The system has low physical memory. This is detected by either the low memory notification from the OS or low memory indicated by the host.
2. The memory that is used by allocated objects on the managed heap surpasses an acceptable threshold. This threshold is continuously adjusted as the process runs.
3. The GC.Collect method is called. In almost all cases, you do not have to call this method, because the garbage collector runs continuously. This method is primarily used for unique situations and testing.
And I think memory churn demonstrates that GC invocations don't happen at a fixed interval.
About your question:
Is it posible to press back button 100 times without the garbage collector cleaning?
It depends on your Android system environment (is your app in the foreground or the background? Is there enough memory?). But the collection will happen eventually.
So, regarding memory questions, I think memory leaks and OOM (mainly due to Bitmaps) deserve more attention. Memory churn should also be avoided, because it affects Android rendering (UI performance).
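If you want to convince yourself that the old field values really do become unreachable and collectable once the statics are reassigned, a small console sketch like the one below can help. The WeakReference and the forced GC.Collect are only there to make the effect observable (in a Debug build the JIT may keep the old object alive a little longer), and the CurrentOrder stand-in is deliberately simplified:

using System;
using System.Collections.Generic;

static class CurrentOrder
{
    // Simplified stand-in for the class from the question.
    public static List<string> listOfOrderArticles = new List<string>();
}

class Program
{
    static void Main()
    {
        // Track the old list without keeping it alive.
        var weak = new WeakReference(CurrentOrder.listOfOrderArticles);

        // "Back button pressed": the static field now points to a fresh list,
        // so nothing references the old one any more.
        CurrentOrder.listOfOrderArticles = new List<string>();

        // Forced only to make the effect visible; normally the runtime decides
        // when to collect.
        GC.Collect();
        GC.WaitForPendingFinalizers();

        Console.WriteLine(weak.IsAlive
            ? "Old list is still in memory"
            : "Old list has been collected");
    }
}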
I'm somewhat new to Delphi, and this question is just me being curious. (I also just tried using it by accident only to discover I'm not supposed to.)
If you look at the documentation for TObject.InitInstance it tells you not to use it unless you're overriding NewInstance. The method is also public. Why not make it protected if the user is never supposed to call it?
Since I was around when this whole Delphi thing got started back around mid-1992, there are likely several answers to this question. If you look at the original declaration for TObject in Delphi 1, there weren't any protected/private members on TObject. That was because very early on in the development of Delphi and in concert with the introduction of exceptions to the language, exceptions were allocated from a different heap than other objects. This was the genesis of the NewInstance/InitInstance/CleanupInstance/FreeInstance functions. Overriding these functions on your class types you can literally control where an object is allocated.
In recent years I've used this functionality to create a cache of object instances that are literally "recycled". By intercepting NewInstance and FreeInstance, I created a system where instances are not returned to the heap upon de-allocation, rather they are placed on a lock-free/low-lock linked list. This makes allocating/freeing instances of a particular type much faster and eliminates a lot of excursions into the memory manager.
Having InitInstance public (its counterpart being CleanupInstance) allows those methods to be called from other utility functions. In the case I mentioned above, InitInstance could be called on an existing block of memory without having to be called only from NewInstance. Suppose NewInstance calls a general-purpose function that manages the aforementioned cache. The "scope" of the class instance is lost, so the only way to call InitInstance is if it is public.
One of these days, we'll likely ship the code that does what I described above... for now it's part of an internal "research" project.
Oh, as an aside and also a bit of a history lesson... Prior to the Delphi 1 release, the design of how Exception instances were allocated/freed was returned to using the same heap as all the other objects. Because of an overall collective misstep, it was assumed that we needed to specially allocate all Exception object instances to "protect" the out-of-memory case. We reasoned that if we try to raise an exception because the memory manager is "out of memory", how in the blazes would we allocate the exception instance!? We already know there is no memory at that point! So we decided that a separate heap was necessary for all exceptions... until either Chuck Jazdzewski or Anders Hejlsberg (I forget exactly which one) figured out a simple, rather clever solution... Just pre-allocate the out-of-memory exception on startup! We still needed to control whether or not the exception should ever actually be freed (Exception instances are automatically freed once handled), so the whole NewInstance/FreeInstance mechanism remained.
Well never say never. In the VCL too much stuff is private and not virtual as it is, so I kinda like the fact that this stuff is public.
It isn't really necessary for normal use, but in specific cases, you might use it to allocate objects in bulk. NewInstance reserves a bit of memory for the object and then calls InitInstance to initialize it. You could write a piece of code that allocates memory for a great number of objects in one go, and then calls InitInstance for different parts of that large block to initialize different blocks in it. Such an implementation could be the base for a flyweight pattern implementation.
Normally you wouldn't need such a thing at all, but it's nice that you can if you really want/need to.
How does it work?
The fun thing is: a constructor in Delphi is just some method. The Create method itself doesn't do anything special. If you look at it, it is just a method as any other. It's even empty in TObject!
You can even call it on an instance (call MyObject.Create instead of TMyObject.Create), and it won't return a new object at all. The key is in the constructor keyword. That tells the compiler, that before executing the TAnyClass.Create method, it should also construct an actual object instance.
That construction means basically calling NewInstance. NewInstance allocates a piece of memory for the data of the object. After that, it calls InitInstance to do some special initialization of that memory, starting with clearing it (filling with zeroes).
Allocating memory is a relatively expensive task. A memory manager (compiled into your application) needs to find a free piece of memory and assign it to your object. If it doesn't have enough memory available, it needs to make a request to Windows to give it some more. If you have thousands or even millions of objects to create, then this can be inefficient.
In those rare cases, you could decide to allocate the memory for all those objects in one go. In that case you won't call the constructor at all, because you don't want to call NewInstance (because it would allocate extra memory). Instead, you can call InitInstance yourself to initialize pieces of your big chunk of memory.
Anyway, this is just a hypothesis about the reason. Maybe there isn't a reason at all. I've seen so many irrationally applied visibility levels in the VCL. Maybe they just didn't think about it at all. ;)
It gives developers a way to create objects without using NewInstance (e.g. with memory from the stack or a memory pool).
My guess would be in the ABAP memory from the main session, but I'm not sure and cannot find anything in the documentation. Does anyone know for sure?
Check this article for the basic memory layout and terminology, unless you have already done so. The static attributes of a class are handled the same way the global variables of a function pool are (you might think of them as global variables of the class pool, but don't hit me too hard for that analogy). Whenever you open a new internal session (e.g. with SUBMIT), they are reinitialized. You could try to verify this yourself with a small program that recursively calls itself using SUBMIT ... AND RETURN.
I'm having trouble getting the Windsor container and Entity Framework working together. It may be due to a problem I've introduced myself, but the net result is that I'm getting a terrible memory leak.
I have my application set up with an EDMX, repositories, and services; those and the ObjectContext are set to per-web-request in the Windsor configuration file I use. However, when I look at the memory usage in ANTS Memory Profiler, I see that the ObjectContext is still being held onto by a reference, along with its cache, despite my confirming that Dispose has been called.
With each request, more dynamic proxies get stuck in memory. If anyone else has managed to get into a pickle like this and can offer advice on getting out of it, it would be greatly appreciated.
I've managed to track down and resolve the problem by changing the release policy on the kernel of the Windsor container to:
_container.Kernel.ReleasePolicy = new NoTrackingReleasePolicy();
It seems that although the Windsor container calls the Dispose method of per-web-request components, it still hangs on to a reference to them, which prevents them from being garbage collected.
In this case the object it was holding a reference to was of type ObjectContext. Unfortunately, despite this object being disposed, all the dynamic proxies cached in it still remained, effectively meaning that a copy of my database (or at least the parts I was accessing) was being added to memory on each request, causing usage to ramp up.
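For illustration, here is a hedged sketch of where that line fits, using Windsor 3.x fluent registration instead of the XML configuration mentioned in the question. MyEntities is a placeholder for the EF ObjectContext-derived type, and LifestylePerWebRequest additionally requires Windsor's PerWebRequestLifestyleModule to be registered in web.config:

using Castle.MicroKernel.Registration;
using Castle.MicroKernel.Releasers;
using Castle.Windsor;

// Placeholder standing in for the EF ObjectContext-derived type.
public class MyEntities : System.IDisposable
{
    public void Dispose() { }
}

public static class ContainerBootstrapper
{
    public static IWindsorContainer Build()
    {
        var container = new WindsorContainer();

        // Stop Windsor from keeping a reference to every resolved instance;
        // cleanup of per-web-request components is then left to the lifestyle
        // and to your own Dispose calls rather than to the release policy.
        container.Kernel.ReleasePolicy = new NoTrackingReleasePolicy();

        container.Register(
            Component.For<MyEntities>().LifestylePerWebRequest());

        return container;
    }
}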
You are probably not disposing objects correctly. Try wrapping them in "using" blocks.
Cannot say much more without seeing the code.
I had the same problem.
After investigation it seemed that I was missing a call to _container.Release(controller) in my Controller Factory:
public override void ReleaseController(IController controller)
{
    _container.Release(controller);

    var disposable = controller as IDisposable;
    if (disposable != null)
    {
        disposable.Dispose();
    }
}
However, I was also using Windsor 2.1 and adding _container.Release(controller) did not do anything for me.
After updating to v3.1 it seems to work.
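For completeness, a hedged sketch of the resolve side that the ReleaseController override above pairs with (assumes ASP.NET MVC's DefaultControllerFactory and a Windsor container field named _container; everything resolved here should be released again in ReleaseController so Windsor stops tracking it):

using System;
using System.Web.Mvc;
using System.Web.Routing;
using Castle.Windsor;

public class WindsorControllerFactory : DefaultControllerFactory
{
    private readonly IWindsorContainer _container;

    public WindsorControllerFactory(IWindsorContainer container)
    {
        _container = container;
    }

    protected override IController GetControllerInstance(RequestContext requestContext, Type controllerType)
    {
        if (controllerType == null)
        {
            // Let MVC produce its usual "controller not found" behaviour.
            return base.GetControllerInstance(requestContext, controllerType);
        }

        // Resolved controllers are tracked by Windsor until they are released
        // in ReleaseController (shown above).
        return (IController)_container.Resolve(controllerType);
    }
}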
Hope this helps.
p.s. ANTS Memory Profiler - lifesaver!