I built an iPad application and profiled it with Xcode, and I got memory leaks in the Security framework. The following are some of them, from Instruments:
Leaked Object      #   Address      Size        Responsible Library   Responsible Frame
Malloc 128 Bytes   4   <multiple>   512 Bytes   Security              mp_init
Malloc 128 Bytes       0x824aa70    128 Bytes   Security              mp_init
Malloc 128 Bytes       0x824a9f0    128 Bytes   Security              mp_init
Malloc 128 Bytes       0x824a970    128 Bytes   Security              mp_init
Malloc 128 Bytes       0x824a8f0    128 Bytes   Security              mp_init

Leaked Object      #   Address      Size        Responsible Library   Responsible Frame
Malloc 16 Bytes    4   <multiple>   64 Bytes    Security              init
Malloc 16 Bytes        0x824a550    16 Bytes    Security              init
Malloc 16 Bytes        0x824a540    16 Bytes    Security              init
Malloc 16 Bytes        0x82493d0    16 Bytes    Security              init
Malloc 16 Bytes        0x8237ca0    16 Bytes    Security              init
and some of the details:
#   Category           Event Type   Timestamp       RefCt   Address     Size   Responsible Library   Responsible Caller
0   Malloc 128 Bytes   Malloc       00:02.240.888   1       0x824aa70   128    Security              mp_init

#   Category           Event Type   Timestamp       RefCt   Address     Size   Responsible Library   Responsible Caller
0   Malloc 16 Bytes    Malloc       00:02.240.886   1       0x824a550   16     Security              init
I use ASIHTTP, FMDatabase, and some other frameworks in my application (I do not think they cause the problem), and there is no Security framework in "Link Binary With Libraries", so I have no idea what's going wrong.
Can anyone figure out what may cause the leaks?
I am trying to solve the following question:
filter the UDP packets having a size equal to 242 bytes.
I looked at this answer, which suggests udp.length == 209 to set a filter on packet length in Wireshark, but instead of getting packets with a length of 209 bytes I get packets with a length of 243 bytes (see the screenshot). Can anyone explain?
Your image shows a packet like
Frame 243 bytes
'-> Ethernet
'-> IPv4
'-> UDP
'-> Dropbox LAN Sync
Ethernet is 14 bytes: 6 bytes each for the source and destination MAC addresses, plus 2 bytes for the EtherType.
The IPv4 header is a minimum of 20 bytes, but can be larger with options. It just so happens to be 20 here.
Eth 14 bytes + IP 20 bytes = 34 bytes
243 bytes - 34 bytes of Eth/IP = 209 bytes of UDP (the 8-byte UDP header plus 201 bytes of payload), which is exactly what udp.length matches.
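Note that udp.length filters on the UDP length field (header plus payload), not on the total frame size. If the goal is frames whose total size is 242 bytes, one of these display filters should work instead (frame.len and udp.length are standard Wireshark filter fields; the value 208 assumes the same 34 bytes of Ethernet/IP overhead as above):

frame.len == 242     # total frame size, all headers included
udp.length == 208    # equivalent here: 242 - 34 bytes of Eth/IP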
I've seen a fair bit of noise about "false positives," and have even encountered them myself.
However, this takes the cake.
It is easy to reproduce: using Swift 5/Xcode 10.2, create a new single-view iOS app.
Run Leaks.
You get these critters:
Malloc 64 Bytes    1    0x600001d084c0   64 Bytes    Foundation +[NSString stringWithUTF8String:]
Malloc 16 Bytes    3    <multiple>       48 Bytes
Malloc 1.50 KiB    3    <multiple>       4.50 KiB
Malloc 32 Bytes    3    <multiple>       96 Bytes
Malloc 8.00 KiB    1    0x7fc56f000c00   8.00 KiB
Malloc 64 Bytes    10   <multiple>       640 Bytes
Malloc 80 Bytes    3    <multiple>       240 Bytes
Malloc 4.00 KiB    3    <multiple>       12.00 KiB
This is using the simulator (iPhone XR, iOS 12.2).
That first one has a stack trace, but it's worthless.
Is there some way that I can correct for this noise? I'm writing an infrastructure component, and I need to:
A) Make damn sure it doesn't leak, and
B) Not have every Cocoapod jockey on Earth emailing me to tell me that my component leaks.
If you use an iOS 12.1 simulator, the Leaks instrument still works correctly (Swift 5/Xcode 10.2). For now, we can only hope it will be fixed in a future version.
My memory is DDR2 800 MHz:
dmidecode -t memory
Handle 0x1100, DMI type 17, 27 bytes
Memory Device
Array Handle: 0x1000
Error Information Handle: Not Provided
Total Width: 64 bits
Data Width: 64 bits
Size: 1024 MB
Form Factor: DIMM
Set: None
Locator: DIMM_1
Bank Locator: Not Specified
Type: DDR2
Type Detail: Synchronous
Speed: 800 MHz (1.2 ns)
Manufacturer: 7F98000000000000
Serial Number: 2DCCDD00
Asset Tag: 050916
Part Number:
I want to estimate the read/write speed from this info.
Does 800 MHz mean it reads/writes 800 bits per second?
Should we multiply by the 64-bit data width, e.g. 800 * 64 bits/sec, so that we can read/write 800 * 64 / 8 bytes/sec?
Thanks in advance!
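For what it's worth, here is a rough worked calculation (my own arithmetic, not from the question): "800 MHz" for DDR2 is the effective transfer rate, i.e. 800 million transfers per second, and each transfer moves the full 64-bit data width. The theoretical peak bandwidth is therefore

$$800 \times 10^{6}\ \text{transfers/s} \times \frac{64\ \text{bits}}{8\ \text{bits/byte}} = 6400\ \text{MB/s}$$

which is why such modules are sold as PC2-6400. Real sustained throughput is lower because of refresh cycles, latency, and command overhead.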
I found this question in one of my previous exam papers and I am not really sure I got the right answer. As far as I can see, 2^15 is 32768, which is 32 MB, so the answer could be 15 bits. But I think I'm missing something here?
32768 bytes is not 32 MB.
32 MB = 32 * 1024 KB = 32 * 1024 * 1024 bytes = 2^5 * 2^10 * 2^10 = 2^25 bytes.
That is, 33,554,432 bytes = 32 MB.
So you will need at least 25 bits to address a single byte in that memory scheme.
Yes, you are off by a few powers of two: 32768 ≠ 32 MB.
1 M is 2^20 and 32 is 2^5, so you need 25 bits.
Since 1 MB = 2^20 bytes, for 32 MB we have:
32 = 2^5
1 MB = 2^20 bytes, so
32 MB = 2^5 * 2^20 = 2^25 bytes.
The question asks "How many address bits...", and in the usual byte-addressable scheme each address selects one byte, so there is nothing further to multiply by.
Thus, 25 bits are needed. (Only if the memory were bit-addressable would you multiply by 2^3 = 8, giving 2^25 * 2^3 = 2^28 and 28 bits, but that is not the usual convention.)
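As a general rule of thumb (my own summary, not from the answers above): a byte-addressable memory of $S$ bytes needs

$$\lceil \log_2 S \rceil = \lceil \log_2 (32 \times 2^{20}) \rceil = \lceil \log_2 2^{25} \rceil = 25$$

address bits in this case.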
I have written a program in OpenGL ES 2.0 with the PVRSDK on Ubuntu 10.10. The thing is that I get the output I want, but it appears and then the window disappears.
If I put a breakpoint in, I see what I want, but I don't understand why the window disappears.
When I check the memory with Valgrind, I get these errors:
==5997== Memcheck, a memory error detector
==5997== Copyright (C) 2002-2010, and GNU GPL'd, by Julian Seward et al.
==5997== Using Valgrind-3.6.0.SVN-Debian and LibVEX; rerun with -h for copyright info
==5997== Command: ./cube
==5997==
libEGL warning: use software fallback
==5997==
==5997== HEAP SUMMARY:
==5997== in use at exit: 12,126 bytes in 94 blocks
==5997== total heap usage: 97,499 allocs, 97,405 frees, 197,353,970 bytes allocated
==5997==
==5997== 8 bytes in 1 blocks are definitely lost in loss record 3 of 57
==5997== at 0x4024F12: calloc (vg_replace_malloc.c:467)
==5997== by 0x486CE90: __glXInitialize (glxinit.c:584)
==5997== by 0x486AA61: x11_screen_support (x11_screen.c:133)
==5997== by 0x486DEEB: x11_create_dri2_display (native_dri2.c:700)
==5997== by 0x4869AEA: native_create_display (native_x11.c:42)
==5997== by 0x4866A2A: egl_g3d_initialize (egl_g3d.c:498)
==5997== by 0x4176F36: _eglMatchDriver (egldriver.c:580)
==5997== by 0x4170BBF: eglInitialize (eglapi.c:294)
==5997== by 0x8049EFF: main (Hello.cpp:231)
==5997==
==5997== 40 bytes in 1 blocks are definitely lost in loss record 31 of 57
==5997== at 0x4026351: operator new(unsigned int) (vg_replace_malloc.c:255)
==5997== by 0x8049CE3: main (Hello.cpp:206)
==5997==
==5997== 72 bytes in 1 blocks are definitely lost in loss record 34 of 57
==5997== at 0x4024F12: calloc (vg_replace_malloc.c:467)
==5997== by 0x4A50C7C: st_bufferobj_alloc (st_cb_bufferobjects.c:56)
==5997== by 0x49A1B78: _mesa_alloc_shared_state (shared.c:94)
==5997== by 0x498860F: _mesa_initialize_context_for_api (context.c:904)
==5997== by 0x49886DB: _mesa_create_context_for_api (context.c:1050)
==5997== by 0x49C7479: st_create_context (st_context.c:176)
==5997== by 0x4985A80: st_api_create_context (st_manager.c:646)
==5997== by 0x4868843: egl_g3d_create_context (egl_g3d_api.c:131)
==5997== by 0x4173127: eglCreateContext (eglapi.c:413)
==5997== by 0x804A039: main (Hello.cpp:261)
==5997==
==5997== 1,546 (48 direct, 1,498 indirect) bytes in 1 blocks are definitely lost in loss record 55 of 57
==5997== at 0x4025BD3: malloc (vg_replace_malloc.c:236)
==5997== by 0x4BABD13: talloc_enable_null_tracking (in /usr/lib/libtalloc.so.2.0.1)
==5997== by 0x4BABE28: talloc_init (in /usr/lib/libtalloc.so.2.0.1)
==5997== by 0x49F48B5: glsl_symbol_table::glsl_symbol_table() (glsl_symbol_table.cpp:60)
==5997== by 0x49F2566: _mesa_glsl_parse_state::_mesa_glsl_parse_state(__GLcontextRec*, unsigned int, void*) (glsl_parser_extras.cpp:50)
==5997== by 0x49D68B3: _mesa_glsl_compile_shader (ir_to_mesa.cpp:2815)
==5997== by 0x49A0486: _mesa_CompileShaderARB (shaderapi.c:807)
==5997== by 0x804A0E1: main (Hello.cpp:280)
==5997==
==5997== LEAK SUMMARY:
==5997== definitely lost: 168 bytes in 4 blocks
==5997== indirectly lost: 1,498 bytes in 28 blocks
==5997== possibly lost: 0 bytes in 0 blocks
==5997== still reachable: 10,460 bytes in 62 blocks
==5997== suppressed: 0 bytes in 0 blocks
==5997== Reachable blocks (those to which a pointer was found) are not shown.
==5997== To see them, rerun with: --leak-check=full --show-reachable=yes
==5997==
==5997== For counts of detected and suppressed errors, rerun with: -v
==5997== ERROR SUMMARY: 4 errors from 4 contexts (suppressed: 54 from 11)
Here are the lines it mentions:
206 x11Visual = new XVisualInfo;
207     XMatchVisualInfo( x11Display, x11Screen, i32Depth, TrueColor, x11Visual);
208 if (!x11Visual)
209 {
210 printf("Error: Unable to acquire visual\n");
211 goto cleanup;
212 }
===========================================================================================
230 EGLint iMajorVersion, iMinorVersion;
231 if (!eglInitialize(eglDisplay, &iMajorVersion, &iMinorVersion))
232 {
233 printf("Error: eglInitialize() failed.\n");
234 }
==========================================================================================
261     eglContext = eglCreateContext(eglDisplay, eglConfig, NULL, ai32ContextAttribs);
262 if (!TestEGLError("eglCreateContext"))
263 {
264 goto cleanup;
265 }
============================================================================================
279 glShaderSource(uiFragShader, 1, (const char**)&pszFragShader, NULL);
280 glCompileShader(uiFragShader);
So I just want to know: is it because of these errors that the screen disappears, or might something else be the reason?
"is it because of these errors that the screen disappears, or might something else be the reason"
The latter. Leaks do not affect program runtime (unless they are so large that the program fails to allocate more memory when it needs it; but your leaks are small).
Did you forget to enter the X event loop?
Update:
int i32NumMessages = XPending( x11Display );
for( int i = 0; i < i32NumMessages; i++ )
{
XEvent event;
XNextEvent( x11Display, &event );
}
That loop is in fact
1. obviously wrong, and
2. the reason your application (almost) immediately exits.
You are processing only the events queued so far (which is likely a very small number of events), and you are actually discarding them. You'll probably want to write something like the sketch below instead.
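A minimal sketch of such a loop (assuming the question's existing x11Display, eglDisplay, and eglSurface handles, that XSelectInput was called with KeyPressMask during window setup, and a hypothetical renderFrame() standing in for the GLES draw calls from Hello.cpp):

for (bool done = false; !done; )
{
    // Drain every event queued so far instead of sampling the queue once.
    while (XPending(x11Display))
    {
        XEvent event;
        XNextEvent(x11Display, &event);
        if (event.type == KeyPress)         // quit on any key press
            done = true;
    }

    renderFrame();                          // hypothetical: your GLES 2.0 draw calls
    eglSwapBuffers(eglDisplay, eglSurface); // present the frame, then loop again
}

The difference from the original is that draining and rendering happen on every iteration of an outer loop that only exits on an explicit quit condition, so the program keeps running, and the window stays up, until the user asks it to stop.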