PyTorch model parameters change between CPU and GPU - machine-learning

I created the model and saved the weights using Google Colab. Now I have written a prediction script.
The prediction script contains the model class, and I am trying to load the model weights using the following method:
Saving & Loading Model Across Devices
Save on GPU, Load on CPU
Save:
torch.save(model.state_dict(), PATH)
Load:
device = torch.device('cpu')
model = TheModelClass(*args, **kwargs)
model.load_state_dict(torch.load(PATH, map_location=device))
The above method should work, right? Yes.
But when I try to do so, the model has a different number of parameters in Google Colab (prediction, runtime=None, device=cpu) than on my local machine (prediction, device=cpu).
Model params in Colab:

def count_parameters(model):
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

print(f'The model has {count_parameters(model):,} trainable parameters')
The model has 12,490,234 trainable parameters
+-------------------------------------------------------+------------+
| Modules | Parameters |
+-------------------------------------------------------+------------+
| encoder.tok_embedding.weight | 2053376 |
| encoder.pos_embedding.weight | 25600 |
| encoder.layers.0.self_attn_layer_norm.weight | 256 |
| encoder.layers.0.self_attn_layer_norm.bias | 256 |
| encoder.layers.0.ff_layer_norm.weight | 256 |
| encoder.layers.0.ff_layer_norm.bias | 256 |
| encoder.layers.0.self_attention.fc_q.weight | 65536 |
| encoder.layers.0.self_attention.fc_q.bias | 256 |
| encoder.layers.0.self_attention.fc_k.weight | 65536 |
| encoder.layers.0.self_attention.fc_k.bias | 256 |
| encoder.layers.0.self_attention.fc_v.weight | 65536 |
| encoder.layers.0.self_attention.fc_v.bias | 256 |
| encoder.layers.0.self_attention.fc_o.weight | 65536 |
| encoder.layers.0.self_attention.fc_o.bias | 256 |
| encoder.layers.0.positionwise_feedforward.fc_1.weight | 131072 |
| encoder.layers.0.positionwise_feedforward.fc_1.bias | 512 |
| encoder.layers.0.positionwise_feedforward.fc_2.weight | 131072 |
| encoder.layers.0.positionwise_feedforward.fc_2.bias | 256 |
| encoder.layers.1.self_attn_layer_norm.weight | 256 |
| encoder.layers.1.self_attn_layer_norm.bias | 256 |
| encoder.layers.1.ff_layer_norm.weight | 256 |
| encoder.layers.1.ff_layer_norm.bias | 256 |
| encoder.layers.1.self_attention.fc_q.weight | 65536 |
| encoder.layers.1.self_attention.fc_q.bias | 256 |
| encoder.layers.1.self_attention.fc_k.weight | 65536 |
| encoder.layers.1.self_attention.fc_k.bias | 256 |
| encoder.layers.1.self_attention.fc_v.weight | 65536 |
| encoder.layers.1.self_attention.fc_v.bias | 256 |
| encoder.layers.1.self_attention.fc_o.weight | 65536 |
| encoder.layers.1.self_attention.fc_o.bias | 256 |
| encoder.layers.1.positionwise_feedforward.fc_1.weight | 131072 |
| encoder.layers.1.positionwise_feedforward.fc_1.bias | 512 |
| encoder.layers.1.positionwise_feedforward.fc_2.weight | 131072 |
| encoder.layers.1.positionwise_feedforward.fc_2.bias | 256 |
| encoder.layers.2.self_attn_layer_norm.weight | 256 |
| encoder.layers.2.self_attn_layer_norm.bias | 256 |
| encoder.layers.2.ff_layer_norm.weight | 256 |
| encoder.layers.2.ff_layer_norm.bias | 256 |
| encoder.layers.2.self_attention.fc_q.weight | 65536 |
| encoder.layers.2.self_attention.fc_q.bias | 256 |
| encoder.layers.2.self_attention.fc_k.weight | 65536 |
| encoder.layers.2.self_attention.fc_k.bias | 256 |
| encoder.layers.2.self_attention.fc_v.weight | 65536 |
| encoder.layers.2.self_attention.fc_v.bias | 256 |
| encoder.layers.2.self_attention.fc_o.weight | 65536 |
| encoder.layers.2.self_attention.fc_o.bias | 256 |
| encoder.layers.2.positionwise_feedforward.fc_1.weight | 131072 |
| encoder.layers.2.positionwise_feedforward.fc_1.bias | 512 |
| encoder.layers.2.positionwise_feedforward.fc_2.weight | 131072 |
| encoder.layers.2.positionwise_feedforward.fc_2.bias | 256 |
| decoder.tok_embedding.weight | 3209728 |
| decoder.pos_embedding.weight | 25600 |
| decoder.layers.0.self_attn_layer_norm.weight | 256 |
| decoder.layers.0.self_attn_layer_norm.bias | 256 |
| decoder.layers.0.enc_attn_layer_norm.weight | 256 |
| decoder.layers.0.enc_attn_layer_norm.bias | 256 |
| decoder.layers.0.ff_layer_norm.weight | 256 |
| decoder.layers.0.ff_layer_norm.bias | 256 |
| decoder.layers.0.self_attention.fc_q.weight | 65536 |
| decoder.layers.0.self_attention.fc_q.bias | 256 |
| decoder.layers.0.self_attention.fc_k.weight | 65536 |
| decoder.layers.0.self_attention.fc_k.bias | 256 |
| decoder.layers.0.self_attention.fc_v.weight | 65536 |
| decoder.layers.0.self_attention.fc_v.bias | 256 |
| decoder.layers.0.self_attention.fc_o.weight | 65536 |
| decoder.layers.0.self_attention.fc_o.bias | 256 |
| decoder.layers.0.encoder_attention.fc_q.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_q.bias | 256 |
| decoder.layers.0.encoder_attention.fc_k.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_k.bias | 256 |
| decoder.layers.0.encoder_attention.fc_v.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_v.bias | 256 |
| decoder.layers.0.encoder_attention.fc_o.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_o.bias | 256 |
| decoder.layers.0.positionwise_feedforward.fc_1.weight | 131072 |
| decoder.layers.0.positionwise_feedforward.fc_1.bias | 512 |
| decoder.layers.0.positionwise_feedforward.fc_2.weight | 131072 |
| decoder.layers.0.positionwise_feedforward.fc_2.bias | 256 |
| decoder.layers.1.self_attn_layer_norm.weight | 256 |
| decoder.layers.1.self_attn_layer_norm.bias | 256 |
| decoder.layers.1.enc_attn_layer_norm.weight | 256 |
| decoder.layers.1.enc_attn_layer_norm.bias | 256 |
| decoder.layers.1.ff_layer_norm.weight | 256 |
| decoder.layers.1.ff_layer_norm.bias | 256 |
| decoder.layers.1.self_attention.fc_q.weight | 65536 |
| decoder.layers.1.self_attention.fc_q.bias | 256 |
| decoder.layers.1.self_attention.fc_k.weight | 65536 |
| decoder.layers.1.self_attention.fc_k.bias | 256 |
| decoder.layers.1.self_attention.fc_v.weight | 65536 |
| decoder.layers.1.self_attention.fc_v.bias | 256 |
| decoder.layers.1.self_attention.fc_o.weight | 65536 |
| decoder.layers.1.self_attention.fc_o.bias | 256 |
| decoder.layers.1.encoder_attention.fc_q.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_q.bias | 256 |
| decoder.layers.1.encoder_attention.fc_k.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_k.bias | 256 |
| decoder.layers.1.encoder_attention.fc_v.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_v.bias | 256 |
| decoder.layers.1.encoder_attention.fc_o.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_o.bias | 256 |
| decoder.layers.1.positionwise_feedforward.fc_1.weight | 131072 |
| decoder.layers.1.positionwise_feedforward.fc_1.bias | 512 |
| decoder.layers.1.positionwise_feedforward.fc_2.weight | 131072 |
| decoder.layers.1.positionwise_feedforward.fc_2.bias | 256 |
| decoder.layers.2.self_attn_layer_norm.weight | 256 |
| decoder.layers.2.self_attn_layer_norm.bias | 256 |
| decoder.layers.2.enc_attn_layer_norm.weight | 256 |
| decoder.layers.2.enc_attn_layer_norm.bias | 256 |
| decoder.layers.2.ff_layer_norm.weight | 256 |
| decoder.layers.2.ff_layer_norm.bias | 256 |
| decoder.layers.2.self_attention.fc_q.weight | 65536 |
| decoder.layers.2.self_attention.fc_q.bias | 256 |
| decoder.layers.2.self_attention.fc_k.weight | 65536 |
| decoder.layers.2.self_attention.fc_k.bias | 256 |
| decoder.layers.2.self_attention.fc_v.weight | 65536 |
| decoder.layers.2.self_attention.fc_v.bias | 256 |
| decoder.layers.2.self_attention.fc_o.weight | 65536 |
| decoder.layers.2.self_attention.fc_o.bias | 256 |
| decoder.layers.2.encoder_attention.fc_q.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_q.bias | 256 |
| decoder.layers.2.encoder_attention.fc_k.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_k.bias | 256 |
| decoder.layers.2.encoder_attention.fc_v.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_v.bias | 256 |
| decoder.layers.2.encoder_attention.fc_o.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_o.bias | 256 |
| decoder.layers.2.positionwise_feedforward.fc_1.weight | 131072 |
| decoder.layers.2.positionwise_feedforward.fc_1.bias | 512 |
| decoder.layers.2.positionwise_feedforward.fc_2.weight | 131072 |
| decoder.layers.2.positionwise_feedforward.fc_2.bias | 256 |
| decoder.fc_out.weight | 3209728 |
| decoder.fc_out.bias | 12538 |
+-------------------------------------------------------+------------+
Total Trainable Params: 12490234
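For reference, a per-module breakdown like the one above (and the matching one for the local run below) can be generated by iterating over model.named_parameters(). Here is a minimal sketch, assuming the prettytable package, which the question itself does not show:

from prettytable import PrettyTable

def count_parameters_table(model):
    # Build a two-column table of trainable parameter counts per module.
    table = PrettyTable(["Modules", "Parameters"])
    total = 0
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        table.add_row([name, param.numel()])
        total += param.numel()
    print(table)
    print(f"Total Trainable Params: {total}")
    return total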
Model params on the local machine:

def count_parameters(model):
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

print(f'The model has {count_parameters(model):,} trainable parameters')
The model has 12,506,137 trainable parameters
+-------------------------------------------------------+------------+
| Modules | Parameters |
+-------------------------------------------------------+------------+
| encoder.tok_embedding.weight | 2053376 |
| encoder.pos_embedding.weight | 25600 |
| encoder.layers.0.self_attn_layer_norm.weight | 256 |
| encoder.layers.0.self_attn_layer_norm.bias | 256 |
| encoder.layers.0.ff_layer_norm.weight | 256 |
| encoder.layers.0.ff_layer_norm.bias | 256 |
| encoder.layers.0.self_attention.fc_q.weight | 65536 |
| encoder.layers.0.self_attention.fc_q.bias | 256 |
| encoder.layers.0.self_attention.fc_k.weight | 65536 |
| encoder.layers.0.self_attention.fc_k.bias | 256 |
| encoder.layers.0.self_attention.fc_v.weight | 65536 |
| encoder.layers.0.self_attention.fc_v.bias | 256 |
| encoder.layers.0.self_attention.fc_o.weight | 65536 |
| encoder.layers.0.self_attention.fc_o.bias | 256 |
| encoder.layers.0.positionwise_feedforward.fc_1.weight | 131072 |
| encoder.layers.0.positionwise_feedforward.fc_1.bias | 512 |
| encoder.layers.0.positionwise_feedforward.fc_2.weight | 131072 |
| encoder.layers.0.positionwise_feedforward.fc_2.bias | 256 |
| encoder.layers.1.self_attn_layer_norm.weight | 256 |
| encoder.layers.1.self_attn_layer_norm.bias | 256 |
| encoder.layers.1.ff_layer_norm.weight | 256 |
| encoder.layers.1.ff_layer_norm.bias | 256 |
| encoder.layers.1.self_attention.fc_q.weight | 65536 |
| encoder.layers.1.self_attention.fc_q.bias | 256 |
| encoder.layers.1.self_attention.fc_k.weight | 65536 |
| encoder.layers.1.self_attention.fc_k.bias | 256 |
| encoder.layers.1.self_attention.fc_v.weight | 65536 |
| encoder.layers.1.self_attention.fc_v.bias | 256 |
| encoder.layers.1.self_attention.fc_o.weight | 65536 |
| encoder.layers.1.self_attention.fc_o.bias | 256 |
| encoder.layers.1.positionwise_feedforward.fc_1.weight | 131072 |
| encoder.layers.1.positionwise_feedforward.fc_1.bias | 512 |
| encoder.layers.1.positionwise_feedforward.fc_2.weight | 131072 |
| encoder.layers.1.positionwise_feedforward.fc_2.bias | 256 |
| encoder.layers.2.self_attn_layer_norm.weight | 256 |
| encoder.layers.2.self_attn_layer_norm.bias | 256 |
| encoder.layers.2.ff_layer_norm.weight | 256 |
| encoder.layers.2.ff_layer_norm.bias | 256 |
| encoder.layers.2.self_attention.fc_q.weight | 65536 |
| encoder.layers.2.self_attention.fc_q.bias | 256 |
| encoder.layers.2.self_attention.fc_k.weight | 65536 |
| encoder.layers.2.self_attention.fc_k.bias | 256 |
| encoder.layers.2.self_attention.fc_v.weight | 65536 |
| encoder.layers.2.self_attention.fc_v.bias | 256 |
| encoder.layers.2.self_attention.fc_o.weight | 65536 |
| encoder.layers.2.self_attention.fc_o.bias | 256 |
| encoder.layers.2.positionwise_feedforward.fc_1.weight | 131072 |
| encoder.layers.2.positionwise_feedforward.fc_1.bias | 512 |
| encoder.layers.2.positionwise_feedforward.fc_2.weight | 131072 |
| encoder.layers.2.positionwise_feedforward.fc_2.bias | 256 |
| decoder.tok_embedding.weight | 3217664 |
| decoder.pos_embedding.weight | 25600 |
| decoder.layers.0.self_attn_layer_norm.weight | 256 |
| decoder.layers.0.self_attn_layer_norm.bias | 256 |
| decoder.layers.0.enc_attn_layer_norm.weight | 256 |
| decoder.layers.0.enc_attn_layer_norm.bias | 256 |
| decoder.layers.0.ff_layer_norm.weight | 256 |
| decoder.layers.0.ff_layer_norm.bias | 256 |
| decoder.layers.0.self_attention.fc_q.weight | 65536 |
| decoder.layers.0.self_attention.fc_q.bias | 256 |
| decoder.layers.0.self_attention.fc_k.weight | 65536 |
| decoder.layers.0.self_attention.fc_k.bias | 256 |
| decoder.layers.0.self_attention.fc_v.weight | 65536 |
| decoder.layers.0.self_attention.fc_v.bias | 256 |
| decoder.layers.0.self_attention.fc_o.weight | 65536 |
| decoder.layers.0.self_attention.fc_o.bias | 256 |
| decoder.layers.0.encoder_attention.fc_q.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_q.bias | 256 |
| decoder.layers.0.encoder_attention.fc_k.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_k.bias | 256 |
| decoder.layers.0.encoder_attention.fc_v.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_v.bias | 256 |
| decoder.layers.0.encoder_attention.fc_o.weight | 65536 |
| decoder.layers.0.encoder_attention.fc_o.bias | 256 |
| decoder.layers.0.positionwise_feedforward.fc_1.weight | 131072 |
| decoder.layers.0.positionwise_feedforward.fc_1.bias | 512 |
| decoder.layers.0.positionwise_feedforward.fc_2.weight | 131072 |
| decoder.layers.0.positionwise_feedforward.fc_2.bias | 256 |
| decoder.layers.1.self_attn_layer_norm.weight | 256 |
| decoder.layers.1.self_attn_layer_norm.bias | 256 |
| decoder.layers.1.enc_attn_layer_norm.weight | 256 |
| decoder.layers.1.enc_attn_layer_norm.bias | 256 |
| decoder.layers.1.ff_layer_norm.weight | 256 |
| decoder.layers.1.ff_layer_norm.bias | 256 |
| decoder.layers.1.self_attention.fc_q.weight | 65536 |
| decoder.layers.1.self_attention.fc_q.bias | 256 |
| decoder.layers.1.self_attention.fc_k.weight | 65536 |
| decoder.layers.1.self_attention.fc_k.bias | 256 |
| decoder.layers.1.self_attention.fc_v.weight | 65536 |
| decoder.layers.1.self_attention.fc_v.bias | 256 |
| decoder.layers.1.self_attention.fc_o.weight | 65536 |
| decoder.layers.1.self_attention.fc_o.bias | 256 |
| decoder.layers.1.encoder_attention.fc_q.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_q.bias | 256 |
| decoder.layers.1.encoder_attention.fc_k.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_k.bias | 256 |
| decoder.layers.1.encoder_attention.fc_v.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_v.bias | 256 |
| decoder.layers.1.encoder_attention.fc_o.weight | 65536 |
| decoder.layers.1.encoder_attention.fc_o.bias | 256 |
| decoder.layers.1.positionwise_feedforward.fc_1.weight | 131072 |
| decoder.layers.1.positionwise_feedforward.fc_1.bias | 512 |
| decoder.layers.1.positionwise_feedforward.fc_2.weight | 131072 |
| decoder.layers.1.positionwise_feedforward.fc_2.bias | 256 |
| decoder.layers.2.self_attn_layer_norm.weight | 256 |
| decoder.layers.2.self_attn_layer_norm.bias | 256 |
| decoder.layers.2.enc_attn_layer_norm.weight | 256 |
| decoder.layers.2.enc_attn_layer_norm.bias | 256 |
| decoder.layers.2.ff_layer_norm.weight | 256 |
| decoder.layers.2.ff_layer_norm.bias | 256 |
| decoder.layers.2.self_attention.fc_q.weight | 65536 |
| decoder.layers.2.self_attention.fc_q.bias | 256 |
| decoder.layers.2.self_attention.fc_k.weight | 65536 |
| decoder.layers.2.self_attention.fc_k.bias | 256 |
| decoder.layers.2.self_attention.fc_v.weight | 65536 |
| decoder.layers.2.self_attention.fc_v.bias | 256 |
| decoder.layers.2.self_attention.fc_o.weight | 65536 |
| decoder.layers.2.self_attention.fc_o.bias | 256 |
| decoder.layers.2.encoder_attention.fc_q.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_q.bias | 256 |
| decoder.layers.2.encoder_attention.fc_k.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_k.bias | 256 |
| decoder.layers.2.encoder_attention.fc_v.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_v.bias | 256 |
| decoder.layers.2.encoder_attention.fc_o.weight | 65536 |
| decoder.layers.2.encoder_attention.fc_o.bias | 256 |
| decoder.layers.2.positionwise_feedforward.fc_1.weight | 131072 |
| decoder.layers.2.positionwise_feedforward.fc_1.bias | 512 |
| decoder.layers.2.positionwise_feedforward.fc_2.weight | 131072 |
| decoder.layers.2.positionwise_feedforward.fc_2.bias | 256 |
| decoder.fc_out.weight | 3217664 |
| decoder.fc_out.bias | 12569 |
+-------------------------------------------------------+------------+
Total Trainable Params: 12506137
So that's why I am unable to load the model: the local model has a different number of parameters. When I try to load the weights locally anyway, I get the following:
model.load_state_dict(torch.load(f"{model_name}.pt", map_location=device))
Error-
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-24-f5baac4441a5> in <module>
----> 1 model.load_state_dict(torch.load(f"{model_name}_2.pt", map_location=device))
c:\anaconda\envs\lang_trans\lib\site-packages\torch\nn\modules\module.py in load_state_dict(self, state_dict, strict)
845 if len(error_msgs) > 0:
846 raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
--> 847 self.__class__.__name__, "\n\t".join(error_msgs)))
848 return _IncompatibleKeys(missing_keys, unexpected_keys)
849
RuntimeError: Error(s) in loading state_dict for Seq2Seq:
    size mismatch for decoder.tok_embedding.weight: copying a param with shape torch.Size([12538, 256]) from checkpoint, the shape in current model is torch.Size([12569, 256]).
    size mismatch for decoder.fc_out.weight: copying a param with shape torch.Size([12538, 256]) from checkpoint, the shape in current model is torch.Size([12569, 256]).
    size mismatch for decoder.fc_out.bias: copying a param with shape torch.Size([12538]) from checkpoint, the shape in current model is torch.Size([12569]).
The local model's parameters must be wrong, because in Colab (device=cpu, runtime=None) I am able to load the weights after defining the model class, but on the local machine the parameter counts change, so I cannot load the weights. I know it's weird; please help me find the solution.
You can check the full code of the model here:
https://gist.github.com/Dipeshpal/90c715a7b7f00845e20ef998bda35835
After this, the model params change.
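Comparing the two tables, only the decoder vocabulary differs: 12,538 tokens in the checkpoint versus 12,569 locally, which is exactly the mismatch the error reports for decoder.tok_embedding and decoder.fc_out. That usually means the target vocabulary is being rebuilt on the local machine (e.g. from a torchtext Field/dataset) and comes out with a slightly different size, rather than being reused from training. A minimal sketch of one way to make the local model match the checkpoint is to read the required dimensions from the saved state_dict itself; the file name and model construction below are placeholders, not the code from the gist:

import torch

device = torch.device('cpu')

# Load the state_dict first and read the vocabulary sizes it was trained with.
state_dict = torch.load("model.pt", map_location=device)  # placeholder file name

INPUT_DIM = state_dict['encoder.tok_embedding.weight'].shape[0]   # 8021 in this checkpoint
OUTPUT_DIM = state_dict['decoder.tok_embedding.weight'].shape[0]  # 12538 in this checkpoint
print(f"checkpoint expects INPUT_DIM={INPUT_DIM}, OUTPUT_DIM={OUTPUT_DIM}")

# Build the model with exactly these sizes instead of len(SRC.vocab)/len(TRG.vocab)
# computed from a locally re-built vocab, then the weights load cleanly:
# enc = Encoder(INPUT_DIM, ...)
# dec = Decoder(OUTPUT_DIM, ...)
# model = Seq2Seq(enc, dec, ...).to(device)
# model.load_state_dict(state_dict)

The cleaner long-term fix is to pickle and ship the vocab objects (or tokenizer) together with the weights, so the prediction script never rebuilds them from data.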

Related

MemSQL takes 15GB memory for 10MB of data

I have installed MemSQL 5.1.2 in the following manner with the following resources.
Google cloud server
HDD: 100GB
Machine type: n1-standard-4 (4 vCPUs, 15 GB memory)
Implementation:
2 MemSQL nodes running on the same machine, on the following ports:
3306 Master Aggregator
3307 Leaf
Resource Utilization:
Memory 14.16 GB / 14.69 GB
Paging 0 B/s
Database size - 10MB
1818 memsql 1.1% 77% /var/lib/memsql/leaf-3307/memsqld --defaults-file=/var/lib/memsql/leaf-3307/memsql.cnf --pid-file=/var/lib/memsql/leaf-3307/data/memsqld.pid --user=memsql
2736 memsql 0.3% 16% /var/lib/memsql/master-3306/memsqld --defaults-file=/var/lib/memsql/master-330
Note: There is no Swap memory implemented in the server.
The database size was obtained by querying information_schema.TABLES.
All data resides in the row store, since we have to run queries involving many relationships among tables.
As soon as MemSQL is up, memory usage climbs to 70% and keeps increasing; after 2-3 hours MemSQL gives the following error when we try to connect, and no further connections can be made after that.
OperationalError: (1836, "Leaf 'xx.xxx.x.xx':3307 failed while executing this query. Try re-running the query.")
[Mon Mar 27 09:26:31.163455 2017] [:error] [pid 1718] [remote xxx.xxx.xxx.xxx:9956]
The only solution is to restart the server, since it has taken up all the memory.
What can I do about this? Is there an issue in the way it is set up? Are there any logs I should attach here?
The SHOW STATUS EXTENDED; query gives the following result:
+-------------------------------------+------------------------------------------------------------------------+
| Variable_name | Value |
+-------------------------------------+------------------------------------------------------------------------+
| Aborted_clients | 48 |
| Aborted_connects | 1 |
| Bytes_received | 85962135 |
| Bytes_sent | 545322701 |
| Connections | 1626 |
| Max_used_connections | 69 |
| Queries | 364793 |
| Questions | 364793 |
| Threads_cached | 19 |
| Threads_connected | 50 |
| Threads_created | 69 |
| Threads_running | 1 |
| Threads_background | 1 |
| Threads_idle | 0 |
| Ready_queue | 0 |
| Idle_queue | 0 |
| Context_switches | 1626 |
| Context_switch_misses | 0 |
| Uptime | 22270 |
| Auto_attach_remaining_seconds | 0 |
| Data_directory | /var/lib/memsql/leaf-3307/data |
| Plancache_directory | /var/lib/memsql/leaf-3307/plancache |
| Transaction_logs_directory | /var/lib/memsql/leaf-3307/data/logs |
| Segments_directory | /var/lib/memsql/leaf-3307/data/columns |
| Snapshots_directory | /var/lib/memsql/leaf-3307/data/snapshots |
| Threads_waiting_for_disk_space | 0 |
| Seconds_until_expiration | -1 |
| License_key | 11111111111111111111111111111111 |
| License_type | community |
| Query_compilations | 62 |
| Query_compilation_failures | 0 |
| GCed_versions_last_sweep | 0 |
| Average_garbage_collection_duration | 21 ms |
| Total_server_memory | 9791.4 MB |
| Alloc_thread_stacks | 70.0 MB |
| Malloc_active_memory | 1254.7 (+0.0) MB |
| Malloc_cumulative_memory | 7315.5 (+0.2) MB |
| Buffer_manager_memory | 1787.8 MB |
| Buffer_manager_cached_memory | 77.2 (-0.1) MB |
| Buffer_manager_unrecycled_memory | 0.0 MB |
| Alloc_skiplist_tower | 263.8 MB |
| Alloc_variable | 501.4 MB |
| Alloc_large_variable | 2.4 MB |
| Alloc_table_primary | 752.6 MB |
| Alloc_deleted_version | 92.9 MB |
| Alloc_internal_key_node | 72.1 MB |
| Alloc_hash_buckets | 459.1 MB |
| Alloc_table_metadata_cache | 1.1 MB |
| Alloc_unit_images | 34.8 MB |
| Alloc_unit_ifn_thunks | 0.6 MB |
| Alloc_object_code_images | 11.6 MB |
| Alloc_compiled_unit_sections | 17.3 MB |
| Alloc_databases_list_entry | 17.9 MB |
| Alloc_plan_cache | 0.1 MB |
| Alloc_replication_large | 232.0 MB |
| Alloc_durability_large | 7239.1 MB |
| Alloc_sharding_partitions | 0.1 MB |
| Alloc_security | 0.1 MB |
| Alloc_log_replay | 0.9 MB |
| Alloc_client_connection | 3.0 MB |
| Alloc_protocol_packet | 6.1 (+0.1) MB |
| Alloc_large_incremental | 0.8 MB |
| Alloc_table_memory | 2144.2 MB |
| Alloc_variable_bucket_16 | allocs:10877846 alloc_MB:166.0 buffer_MB:179.0 cached_buffer_MB:1.9 |
| Alloc_variable_bucket_24 | allocs:4275659 alloc_MB:97.9 buffer_MB:106.8 cached_buffer_MB:1.9 |
| Alloc_variable_bucket_32 | allocs:2875801 alloc_MB:87.8 buffer_MB:93.4 cached_buffer_MB:1.9 |
| Alloc_variable_bucket_40 | allocs:724489 alloc_MB:27.6 buffer_MB:31.0 cached_buffer_MB:1.2 |
| Alloc_variable_bucket_48 | allocs:377060 alloc_MB:17.3 buffer_MB:19.8 cached_buffer_MB:0.9 |
| Alloc_variable_bucket_56 | allocs:228720 alloc_MB:12.2 buffer_MB:14.0 cached_buffer_MB:0.8 |
| Alloc_variable_bucket_64 | allocs:150214 alloc_MB:9.2 buffer_MB:10.1 cached_buffer_MB:0.2 |
| Alloc_variable_bucket_72 | allocs:35264 alloc_MB:2.4 buffer_MB:2.9 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_80 | allocs:14920 alloc_MB:1.1 buffer_MB:1.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_88 | allocs:5582 alloc_MB:0.5 buffer_MB:0.6 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_104 | allocs:8075 alloc_MB:0.8 buffer_MB:1.0 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_128 | allocs:8892 alloc_MB:1.1 buffer_MB:1.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_160 | allocs:17614 alloc_MB:2.7 buffer_MB:3.0 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_200 | allocs:30454 alloc_MB:5.8 buffer_MB:6.9 cached_buffer_MB:0.6 |
| Alloc_variable_bucket_248 | allocs:4875 alloc_MB:1.2 buffer_MB:1.5 cached_buffer_MB:0.2 |
| Alloc_variable_bucket_312 | allocs:371 alloc_MB:0.1 buffer_MB:0.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_384 | allocs:30 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_480 | allocs:11 alloc_MB:0.0 buffer_MB:0.1 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_600 | allocs:57 alloc_MB:0.0 buffer_MB:0.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_752 | allocs:62 alloc_MB:0.0 buffer_MB:0.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_936 | allocs:42 alloc_MB:0.0 buffer_MB:0.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_1168 | allocs:106 alloc_MB:0.1 buffer_MB:0.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_1480 | allocs:126 alloc_MB:0.2 buffer_MB:0.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_1832 | allocs:0 alloc_MB:0.0 buffer_MB:0.2 cached_buffer_MB:0.2 |
| Alloc_variable_bucket_2288 | allocs:1 alloc_MB:0.0 buffer_MB:0.2 cached_buffer_MB:0.1 |
| Alloc_variable_bucket_2832 | allocs:33 alloc_MB:0.1 buffer_MB:1.1 cached_buffer_MB:0.2 |
| Alloc_variable_bucket_3528 | allocs:16 alloc_MB:0.1 buffer_MB:0.5 cached_buffer_MB:0.1 |
| Alloc_variable_bucket_4504 | allocs:49 alloc_MB:0.2 buffer_MB:0.8 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_5680 | allocs:66 alloc_MB:0.4 buffer_MB:1.2 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_6224 | allocs:30 alloc_MB:0.2 buffer_MB:1.0 cached_buffer_MB:0.1 |
| Alloc_variable_bucket_7264 | allocs:94 alloc_MB:0.7 buffer_MB:1.5 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_9344 | allocs:70 alloc_MB:0.6 buffer_MB:2.6 cached_buffer_MB:0.2 |
| Alloc_variable_bucket_11896 | allocs:14 alloc_MB:0.2 buffer_MB:2.4 cached_buffer_MB:1.2 |
| Alloc_variable_bucket_14544 | allocs:7 alloc_MB:0.1 buffer_MB:2.4 cached_buffer_MB:1.9 |
| Alloc_variable_bucket_18696 | allocs:18 alloc_MB:0.3 buffer_MB:3.2 cached_buffer_MB:1.9 |
| Alloc_variable_bucket_21816 | allocs:4 alloc_MB:0.1 buffer_MB:0.4 cached_buffer_MB:0.0 |
| Alloc_variable_bucket_26184 | allocs:6 alloc_MB:0.1 buffer_MB:0.9 cached_buffer_MB:0.2 |
| Alloc_variable_bucket_32728 | allocs:13 alloc_MB:0.4 buffer_MB:2.4 cached_buffer_MB:1.4 |
| Alloc_variable_bucket_43648 | allocs:12 alloc_MB:0.5 buffer_MB:1.4 cached_buffer_MB:0.2 |
| Alloc_variable_bucket_65472 | allocs:7 alloc_MB:0.4 buffer_MB:2.8 cached_buffer_MB:1.9 |
| Alloc_variable_bucket_130960 | allocs:3 alloc_MB:0.4 buffer_MB:2.2 cached_buffer_MB:1.9 |
| Alloc_variable_cached_buffers | 21.4 MB |
| Alloc_variable_allocated | 438.7 MB |
| Successful_read_queries | 9048 |
| Successful_write_queries | 19096 |
| Failed_read_queries | 0 |
| Failed_write_queries | 4 |
| Rows_returned_by_reads | 75939 |
| Rows_affected_by_writes | 245 |
| Execution_time_of_reads | 7864 ms |
| Execution_time_of_write | 180311 ms |
| Transaction_buffer_wait_time | 0 ms |
| Transaction_log_flush_wait_time | 0 ms |
| Row_lock_wait_time | 0 ms |
| Ssl_accept_renegotiates | 0 |
| Ssl_accepts | 0 |
| Ssl_callback_cache_hits | 0 |
| Ssl_client_connects | 0 |
| Ssl_connect_renegotiates | 0 |
| Ssl_ctx_verify_depth | 18446744073709551615 |
| Ssl_ctx_verify_mode | 0 |
| Ssl_default_timeout | 0 |
| Ssl_finished_accepts | 0 |
| Ssl_finished_connects | 0 |
| Ssl_session_cache_hits | 0 |
| Ssl_session_cache_misses | 0 |
| Ssl_session_cache_overflows | 0 |
| Ssl_session_cache_size | 20480 |
| Ssl_session_cache_timeouts | 0 |
| Ssl_sessions_reused | 0 |
| Ssl_used_session_cache_entries | 0 |
| Ssl_verify_depth | 0 |
| Ssl_verify_mode | 0 |
| Ssl_cipher | |
| Ssl_cipher_list | |
| Ssl_version | |
| Ssl_session_cache_mode | SERVER |
+-------------------------------------+------------------------------------------------------------------------+
From the status output, we can see:
10GB total memory on the leaf node
7GB Alloc_durability_large
You can see what these variables mean here: https://help.memsql.com/hc/en-us/articles/115001091386-What-Is-Using-Memory-on-My-Leaves-
Most interesting is the large amount in Alloc_durability_large, which is unusual. Do you have a large number of databases and/or partitions? (You can check by counting the number of rows in SHOW DATABASES EXTENDED on the leaf node.) Each one requires a fixed amount of transaction buffer memory (the default is 64 MB).
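To make that check concrete: MemSQL speaks the MySQL wire protocol, so the partition count on the leaf can be pulled with a few lines of client code. This is only a sketch; the pymysql package, host, and credentials below are assumptions, not details from the question:

import pymysql

# Connect directly to the leaf node (port 3307 in this setup).
conn = pymysql.connect(host="127.0.0.1", port=3307, user="root", password="")
with conn.cursor() as cur:
    cur.execute("SHOW DATABASES EXTENDED")
    rows = cur.fetchall()
conn.close()

# Each database/partition on the leaf reserves transaction buffer memory
# (64 MB by default), which shows up under Alloc_durability_large.
print(f"{len(rows)} databases/partitions on this leaf")
print(f"~{len(rows) * 64} MB of transaction buffer at the 64 MB default")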

"Extract" intervals from series in Google Sheets

If, in Google Sheets, I have a series defined as
[29060, 29062, 29331, 29332, 29333, 29334, 29335, 29336, 29337, 29338, 29339, 29340, 29341,
29342, 29372, 29373].
How do I make them line up in intervals like this?
|From |To |
|29060 |29062 |
|29331 |29342 |
|29372 |29373 |
I can't find any good answers for this anywhere. Please, help!
Data/Formulas
A1:
29060, 29062, 29331, 29332, 29333, 29334, 29335, 29336, 29337, 29338, 29339, 29340, 29341,
29342, 29372, 29373
B1: =transpose(split(A1,",")). Converts the input text is an a vertical array.
C1: =FILTER(B1:B16,mod(ROW(B1:B16),2)<>0). Returns values in odd rows.
D1: =FILTER(B1:B16,mod(ROW(B1:B16),2)=0). Returns values in even rows.
E1: =ArrayFormula(FILTER(C1:C8,{TRUE();C2:C8<>D1:D7+1})). Returns values that start a range.
F1: =ArrayFormula(FILTER(D1:D8,{D1:D7+2<>D2:D8;TRUE()})). Returns values that end a range.
Result
Note: A1 values are not shown for readability.
+----+---+-------+-------+-------+-------+-------+
| | A | B | C | D | E | F |
+----+---+-------+-------+-------+-------+-------+
| 1 | | 29060 | 29060 | 29062 | 29060 | 29062 |
| 2 | | 29062 | 29331 | 29332 | 29331 | 29342 |
| 3 | | 29331 | 29333 | 29334 | 29372 | 29373 |
| 4 | | 29332 | 29335 | 29336 | | |
| 5 | | 29333 | 29337 | 29338 | | |
| 6 | | 29334 | 29339 | 29340 | | |
| 7 | | 29335 | 29341 | 29342 | | |
| 8 | | 29336 | 29372 | 29373 | | |
| 9 | | 29337 | | | | |
| 10 | | 29338 | | | | |
| 11 | | 29339 | | | | |
| 12 | | 29340 | | | | |
| 13 | | 29341 | | | | |
| 14 | | 29342 | | | | |
| 15 | | 29372 | | | | |
| 16 | | 29373 | | | | |
+----+---+-------+-------+-------+-------+-------+
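For comparison, the pair-and-merge logic that these formulas implement (treat the flat list as alternating From/To pairs, then merge a pair into the previous interval when its From directly follows the previous To) can be written out explicitly. This Python sketch is only an illustration and is not part of the original answer:

def merge_pairs(values):
    # Split the flat list into alternating (from, to) pairs.
    pairs = list(zip(values[0::2], values[1::2]))
    merged = [list(pairs[0])]
    for frm, to in pairs[1:]:
        if frm == merged[-1][1] + 1:   # contiguous with the previous pair: extend it
            merged[-1][1] = to
        else:                          # gap found: start a new interval
            merged.append([frm, to])
    return [tuple(p) for p in merged]

series = [29060, 29062, 29331, 29332, 29333, 29334, 29335, 29336, 29337,
          29338, 29339, 29340, 29341, 29342, 29372, 29373]
print(merge_pairs(series))
# [(29060, 29062), (29331, 29342), (29372, 29373)]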

Auto layout errors when adding "line" to view on iOS 7

I have seven elements side by side for which Auto Layout should calculate their widths, and the parent view should determine their height. On this parent view I want to add a line to separate the areas more clearly. When adding this line (a UIView with a background color) I get some black bars at the right side of the parent view (the container for the seven elements) and also on one of the other views.
If I don't add this line, everything works fine on iOS 7. On iOS 8 it works even when the line is present. These are my Auto Layout constraints for the line:
V:[line(==0.5)]| and H:|-0-[line]-0-|
The seven labels have the following constraints:
H:|-0-[first][second(==first)][third(==first)][fourth(==first)][fifth(==first)][sixt(==first)][seventh(==first)]-0-|
and
V:|[first]|
and so on.
Why does adding the line not work with the iOS 7 Auto Layout system? How can I add my line correctly?
Edit:
My seven labels are in a container view, the header. The header is placed together with a collection view (added as a child view controller) on a normal view controller. This "main" view controller (or container, if you like) has the following constraints:
H:|-0-[header]-0-| - header holding the seven labels
H:|-0-[monthView]-0-| - collection view
H:|-0-[toolbar]-0-| - toolbar
V:|-0-[header(==15)][monthView][toolbar(==44)]-0-| - vertical position all three elements
I managed to get an Auto Layout trace once all views were loaded:
*<UIWindow:0x7897f370> - AMBIGUOUS LAYOUT
| *<UILayoutContainerView:0x78eb09f0>
| | *<UINavigationTransitionView:0x78db1b50>
| | | *<UIViewControllerWrapperView:0x78dbb400>
| | | | *<UIView:0x7897c780>
| | | | | *<MonthView_MonthViewHeader:0x78ddd600>
| | | | | | *<UILabel:0x78ddede0>
| | | | | | *<UILabel:0x78de29e0>
| | | | | | *<UILabel:0x78de2a90>
| | | | | | *<UILabel:0x78de2bc0>
| | | | | | *<UILabel:0x78de2de0>
| | | | | | *<UILabel:0x78de3140>
| | | | | | *<UILabel:0x78de3360>
| | | | | | *<UIView:0x78de3580>
| | | | | *<UICollectionViewControllerWrapperView:0x78ebdd00>
| | | | | | *<UICollectionView:0x795c1a00>
| | | | | | | *<MonthView_MonthViewCell:0x78ed1270>
| | | | | | | | <UIView:0x78d2cab0>
| | | | | | | | *<UIView:0x78d2d660>
| | | | | | | | | *<UIView:0x78d2c8b0>
| | | | | | | | | *<UILabel:0x78d2c6a0>
| | | | | | | *<MonthView_MonthViewCell:0x78d29450>
| | | | | | | | <UIView:0x78d290d0>
| | | | | | | | *<UIView:0x78d293f0>
| | | | | | | | | *<UIView:0x78d29020>
| | | | | | | | | *<UILabel:0x78d28f10>
| | | | | | | *<MonthView_MonthViewCell:0x78d27ec0>
| | | | | | | | <UIView:0x78d27c30>
| | | | | | | | *<UIView:0x78d27e60>
| | | | | | | | | *<UIView:0x78d27b80>
| | | | | | | | | *<UILabel:0x78d27a70>
| | | | | | | *<MonthView_MonthViewCell:0x78d26a20>
| | | | | | | | <UIView:0x78d26790>
| | | | | | | | *<UIView:0x78d269c0>
| | | | | | | | | *<UIView:0x78d266e0>
| | | | | | | | | *<UILabel:0x78d265d0>
| | | | | | | *<MonthView_MonthViewCell:0x78d253e0>
| | | | | | | | <UIView:0x78d25170>
| | | | | | | | *<UIView:0x78d25380>
| | | | | | | | | *<UIView:0x78d250c0>
| | | | | | | | | *<UILabel:0x78d24fb0>
| | | | | | | *<MonthView_MonthViewCell:0x78d23f60>
| | | | | | | | <UIView:0x78d23cd0>
| | | | | | | | *<UIView:0x78d23f00>
| | | | | | | | | *<UIView:0x78d23c20>
| | | | | | | | | *<UILabel:0x78d23b10>
| | | | | | | *<MonthView_MonthViewCell:0x78d1ed90>
| | | | | | | | <UIView:0x78d1eb00>
| | | | | | | | *<UIView:0x78d1ed30>
| | | | | | | | | *<UIView:0x78d1ea50>
| | | | | | | | | *<UILabel:0x78d1e940>
| | | | | | | *<MonthView_MonthViewCell:0x78d1d860>
| | | | | | | | <UIView:0x78d1d610>
| | | | | | | | *<UIView:0x78d1d800>
| | | | | | | | | *<UIView:0x78d1d560>
| | | | | | | | | *<UILabel:0x78d1d450>
| | | | | | | *<MonthView_MonthViewCell:0x78d1c3f0>
| | | | | | | | <UIView:0x78d1c160>
| | | | | | | | *<UIView:0x78d1c390>
| | | | | | | | | *<UIView:0x78d1c0b0>
| | | | | | | | | *<UILabel:0x78d1bfa0>
| | | | | | | *<MonthView_MonthViewCell:0x78d1af40>
| | | | | | | | <UIView:0x78d1acb0>
| | | | | | | | *<UIView:0x78d1aee0>
| | | | | | | | | *<UIView:0x78d1ac00>
| | | | | | | | | *<UILabel:0x78d1aaf0>
| | | | | | | *<MonthView_MonthViewCell:0x78d19a90>
| | | | | | | | <UIView:0x78d19800>
| | | | | | | | *<UIView:0x78d19a30>
| | | | | | | | | *<UIView:0x78d19750>
| | | | | | | | | *<UILabel:0x78d19640>
| | | | | | | *<MonthView_MonthViewCell:0x78d185e0>
| | | | | | | | <UIView:0x78d18350>
| | | | | | | | *<UIView:0x78d18580>
| | | | | | | | | *<UIView:0x78d182a0>
| | | | | | | | | *<UILabel:0x78d18190>
| | | | | | | *<MonthView_MonthViewCell:0x78d170c0>
| | | | | | | | <UIView:0x78d16e60>
| | | | | | | | *<UIView:0x78d17060>
| | | | | | | | | *<UIView:0x78d16db0>
| | | | | | | | | *<UILabel:0x78d16ca0>
| | | | | | | *<MonthView_MonthViewCell:0x78d15c40>
| | | | | | | | <UIView:0x78d159b0>
| | | | | | | | *<UIView:0x78d15be0>
| | | | | | | | | *<UIView:0x78d15900>
| | | | | | | | | *<UILabel:0x78d157f0>
| | | | | | | *<MonthView_MonthViewCell:0x78d14790>
| | | | | | | | <UIView:0x78d14500>
| | | | | | | | *<UIView:0x78d14730>
| | | | | | | | | *<UIView:0x78d14450>
| | | | | | | | | *<UILabel:0x78d14340>
| | | | | | | *<MonthView_MonthViewCell:0x78d132e0>
| | | | | | | | <UIView:0x78d13050>
| | | | | | | | *<UIView:0x78d13280>
| | | | | | | | | *<UIView:0x78d12fa0>
| | | | | | | | | *<UILabel:0x78d12e90>
| | | | | | | *<MonthView_MonthViewCell:0x78d11e30>
| | | | | | | | <UIView:0x78d11ba0>
| | | | | | | | *<UIView:0x78d11dd0>
| | | | | | | | | *<UIView:0x78d11af0>
| | | | | | | | | *<UILabel:0x78d119e0>
| | | | | | | *<MonthView_MonthViewCell:0x78d10980>
| | | | | | | | <UIView:0x78d106f0>
| | | | | | | | *<UIView:0x78d10920>
| | | | | | | | | *<UIView:0x78d10640>
| | | | | | | | | *<UILabel:0x78d10530>
| | | | | | | *<MonthView_MonthViewCell:0x78d0f4d0>
| | | | | | | | <UIView:0x78d0f240>
| | | | | | | | *<UIView:0x78d0f470>
| | | | | | | | | *<UIView:0x78d0f190>
| | | | | | | | | *<UILabel:0x78d0f080>
| | | | | | | *<MonthView_MonthViewCell:0x78d0e020>
| | | | | | | | <UIView:0x78d0dd90>
| | | | | | | | *<UIView:0x78d0dfc0>
| | | | | | | | | *<UIView:0x78d0dce0>
| | | | | | | | | *<UILabel:0x78d0dbd0>
| | | | | | | *<MonthView_MonthViewCell:0x78d0c960>
| | | | | | | | <UIView:0x78d0c740>
| | | | | | | | *<UIView:0x78d17190>
| | | | | | | | | *<UIView:0x78d0c690>
| | | | | | | | | *<UILabel:0x78d0c580>
| | | | | | | *<MonthView_MonthViewCell:0x78d0b520>
| | | | | | | | <UIView:0x78d0b290>
| | | | | | | | *<UIView:0x78d0b4c0>
| | | | | | | | | *<UIView:0x78d0b1e0>
| | | | | | | | | *<UILabel:0x78d0b0d0>
| | | | | | | *<MonthView_MonthViewCell:0x78d0a070>
| | | | | | | | <UIView:0x78d09de0>
| | | | | | | | *<UIView:0x78d0a010>
| | | | | | | | | *<UIView:0x78d09d30>
| | | | | | | | | *<UILabel:0x78d09c20>
| | | | | | | *<MonthView_MonthViewCell:0x78d08bc0>
| | | | | | | | <UIView:0x78d08930>
| | | | | | | | *<UIView:0x78d08b60>
| | | | | | | | | *<UIView:0x78d08880>
| | | | | | | | | *<UILabel:0x78d08770>
| | | | | | | *<MonthView_MonthViewCell:0x78d07710>
| | | | | | | | <UIView:0x78d07480>
| | | | | | | | *<UIView:0x78d076b0>
| | | | | | | | | *<UIView:0x78d073d0>
| | | | | | | | | *<UILabel:0x78d072c0>
| | | | | | | *<MonthView_MonthViewCell:0x78d06150>
| | | | | | | | <UIView:0x78d05ec0>
| | | | | | | | *<UIView:0x78d060f0>
| | | | | | | | | *<UIView:0x78d05e10>
| | | | | | | | | *<UILabel:0x78d05d00>
| | | | | | | *<MonthView_MonthViewCell:0x78d04ca0>
| | | | | | | | <UIView:0x78d04a10>
| | | | | | | | *<UIView:0x78d04c40>
| | | | | | | | | *<UIView:0x78d04960>
| | | | | | | | | *<Circle:0x78d3c6b0>
| | | | | | | | | *<UILabel:0x78d04850>
| | | | | | | *<MonthView_MonthViewCell:0x78d037f0>
| | | | | | | | <UIView:0x78d03560>
| | | | | | | | *<UIView:0x78d03790>
| | | | | | | | | *<UIView:0x78d034b0>
| | | | | | | | | *<UILabel:0x78d033a0>
| | | | | | | *<MonthView_MonthViewCell:0x78d02270>
| | | | | | | | <UIView:0x78d01fe0>
| | | | | | | | *<UIView:0x78d02210>
| | | | | | | | | *<UIView:0x78d01f30>
| | | | | | | | | *<UILabel:0x78d01e20>
| | | | | | | *<MonthView_MonthViewCell:0x78d00dc0>
| | | | | | | | <UIView:0x78d00b30>
| | | | | | | | *<UIView:0x78d00d60>
| | | | | | | | | *<UIView:0x78e64000>
| | | | | | | | | *<UILabel:0x78ed2dd0>
| | | | | | | *<MonthView_MonthViewCell:0x78ed3e60>
| | | | | | | | <UIView:0x78ed4130>
| | | | | | | | *<UIView:0x78ed3f00>
| | | | | | | | | *<UIView:0x78ed41e0>
| | | | | | | | | *<UILabel:0x78ed42a0>
| | | | | | | *<MonthView_MonthViewCell:0x78ed5310>
| | | | | | | | <UIView:0x78ed55e0>
| | | | | | | | *<UIView:0x78ed53b0>
| | | | | | | | | *<UIView:0x78ed5690>
| | | | | | | | | *<UILabel:0x78ed5750>
| | | | | | | *<MonthView_MonthViewCell:0x78ed67c0>
| | | | | | | | <UIView:0x78ed6a90>
| | | | | | | | *<UIView:0x78ed6860>
| | | | | | | | | *<UIView:0x78ed6b40>
| | | | | | | | | *<UILabel:0x78ed6c00>
| | | | | | | *<MonthView_MonthViewCell:0x78ed7eb0>
| | | | | | | | <UIView:0x78ed8180>
| | | | | | | | *<UIView:0x78ed7f50>
| | | | | | | | | *<UIView:0x78ed8230>
| | | | | | | | | *<UILabel:0x78ed82f0>
| | | | | | | *<MonthView_MonthViewCell:0x78ed9360>
| | | | | | | | <UIView:0x78ed9630>
| | | | | | | | *<UIView:0x78ed9400>
| | | | | | | | | *<UIView:0x78ed96e0>
| | | | | | | | | *<UILabel:0x78ed97a0>
| | | | | | | *<MonthView_SectionHeader:0x78edb330>
| | | | | | | | *<UIView:0x78eda940>
| | | | | | | | *<UIView:0x78edc740>
| | | | | | | | *<UIView:0x78edc7a0>
| | | | | | | | *<UIView:0x78edca50>
| | | | | | | | *<UIView:0x78edcae0>
| | | | | | | | *<UIView:0x78edcb70>
| | | | | | | | *<UIView:0x78edcc00>
| | | | | | | | *<UILabel:0x78edcc90>
| | | | | | | <UIImageView:0x78df2690>
| | | | | | | <UIImageView:0x78df1c40>
| | | | | | *<MonthView_SectionOverlay:0x78dc8fd0>
| | | | | | | *<GradientView:0x78dcc780>
| | | | | | | *<UILabel:0x78dd3ea0>
| | | | | | *<_UILayoutGuide:0x78dc8a20> - AMBIGUOUS LAYOUT
| | | | | *<MonthView_Toolbar:0x78de7540>
| | | | | | <_UIToolbarBackground:0x78de7c40>
| | | | | | | <_UIBackdropView:0x78d3e940>
| | | | | | | | <_UIBackdropEffectView:0x78d3e730>
| | | | | | | | <UIView:0x78d3e650>
| | | | | | <UIImageView:0x78de8c70>
| | | | | | <UIToolbarTextButton:0x78dfa240>
| | | | | | | <_UIToolbarNavigationButton:0x78df9aa0>
| | | | | | | | <UIButtonLabel:0x78ef69f0>
| | | | | | <UIToolbarTextButton:0x78dfaa50>
| | | | | | | <_UIToolbarNavigationButton:0x78d3eb00>
| | | | | | | | <UIButtonLabel:0x78ef5740>
| | <UINavigationBar:0x78eb1ad0>
| | | <_UINavigationBarBackground:0x78eb2170>
| | | | <_UIBackdropView:0x78db8270>
| | | | | <_UIBackdropEffectView:0x78dba8d0>
| | | | | <UIView:0x78dbb3a0>
| | | | <UIImageView:0x78eb24e0>
| | | <_UINavigationBarBackIndicatorView:0x78eb4f30>
Here you can find the recursive description after the view was loaded.
I think I'll never find out what is wrong with the Auto Layout system on iOS 7, but I now use a workaround: instead of adding the line to the same view as the seven labels, I add it to another container. Pseudo-code:
weekdayHeader = [label1]-[label7]
monthViewHeader = [weekdayHeader][line(==0.5)]
And the monthViewHeader is added together with my other views (UICollectionView, UIToolbar) to my main view controller/container.

Why do I see NSAutoresizingMaskLayoutConstraint when all UI Elements are set to setTranslatesAutoresizingMaskIntoConstraints = NO

I am trying to debug a UI layout. All the elements I have added in code are marked with [self.element setTranslatesAutoresizingMaskIntoConstraints:NO];. The only thing set in the XIB file is the background color of the view (one of many views in a tabbed view controller).
When I look at the NSLog output I see the following:
*<UIWindow:0xc352370> - AMBIGUOUS LAYOUT
| *<UILayoutContainerView:0xc3651b0>
| | *<UINavigationTransitionView:0xc355b40>
| | | *<UIViewControllerWrapperView:0xbd3e250>
| | | | *<UILayoutContainerView:0xbd3da60>
| | | | | *<UITransitionView:0xbd46ed0>
| | | | | | *<UIViewControllerWrapperView:0xc09a450>
| | | | | | | *<UIView:0xbd51f40>
| | | | | | | | *<_UILayoutGuide:0xbd51fa0> - AMBIGUOUS LAYOUT
| | | | | | | | *<_UILayoutGuide:0xbd50a10> - AMBIGUOUS LAYOUT
| | | | | | | | *<UIButton:0xc064170> - AMBIGUOUS LAYOUT
| | | | | | | | | *<UIButtonLabel:0xc09d640>
| | | | | | | | *<UILabel:0xc073990> - AMBIGUOUS LAYOUT
| | | | | | | | *<UIButton:0xc0576a0> - AMBIGUOUS LAYOUT
| | | | | | | | | *<UIButtonLabel:0xc095290>
| | | | | | | | *<UIButton:0xc096640> - AMBIGUOUS LAYOUT
| | | | | | | | | <UIButtonLabel:0xc096820>
| | | | | | | | *<UIButton:0xc098b70> - AMBIGUOUS LAYOUT
| | | | | | | | | <UIButtonLabel:0xc098cb0>
| | | | | | | | *<UIButton:0xc09a4c0> - AMBIGUOUS LAYOUT
| | | | | | | | | <UIButtonLabel:0xc09a6d0>
| | | | | | | | *<UILabel:0xc09c9d0> - AMBIGUOUS LAYOUT
| | | | | | | | *<UILabel:0xc09cc60> - AMBIGUOUS LAYOUT
| | | | | | | | *<UIButton:0xc09ce00> - AMBIGUOUS LAYOUT
| | | | | | | | | <UIButtonLabel:0xc09d010>
| | | | | | | | *<UILabel:0xc0a25f0> - AMBIGUOUS LAYOUT
| | | | | | | | *<UIButton:0xc0a2800> - AMBIGUOUS LAYOUT
| | | | | | | | | <UIButtonLabel:0xc0a2a10>
| | | | | | | | *<UILabel:0xc0a5720> - AMBIGUOUS LAYOUT
| | | | | <UITabBar:0xc356c00>
| | | | | | <_UITabBarBackgroundView:0xbe28cc0>
| | | | | | | <_UIBackdropView:0xbe29100>
| | | | | | | | <_UIBackdropEffectView:0xbe296e0>
| | | | | | | | <UIView:0xbe29780>
| | | | | | <UITabBarButton:0xbd42000>
| | | | | | | <UITabBarSwappableImageView:0xbd41050>
| | | | | | | <UITabBarButtonLabel:0xbd43320>
| | | | | | <UITabBarButton:0xbd462e0>
| | | | | | | <UITabBarSwappableImageView:0xbd45d60>
| | | | | | | <UITabBarButtonLabel:0xbd45c70>
| | | | | | <UITabBarButton:0xbd47770>
| | | | | | | <UITabBarSwappableImageView:0xbd48a90>
| | | | | | | <UITabBarButtonLabel:0xbd486c0>
| | | | | | <UITabBarButton:0xbd4c0c0>
| | | | | | | <UITabBarSwappableImageView:0xbd4c220>
| | | | | | | <UITabBarButtonLabel:0xbd4aea0>
| | | | | | <UIImageView:0xbe29ed0>
| | <UINavigationBar:0xc06c4a0>
| | | <_UINavigationBarBackground:0xc05e720>
| | | | <_UIBackdropView:0xc357d70>
| | | | | <_UIBackdropEffectView:0xc3639a0>
| | | | | <UIView:0xc355470>
| | | | <UIImageView:0xc071980>
| | | <UINavigationItemView:0xc074c80>
| | | | <UILabel:0xc083730>
| | | <_UINavigationBarBackIndicatorView:0xc36edb0>
(lldb) po [0xbd51fa0 constraintsAffectingLayoutForAxis:0]
<__NSArrayM 0xbd39190>(
)
(lldb) po [0xbd50a10 constraintsAffectingLayoutForAxis:0]
<__NSArrayM 0x1121f8f0>(
)
(lldb) po [0xc064170 constraintsAffectingLayoutForAxis:0]
<__NSArrayM 0xc0a14e0>(
<NSLayoutConstraint:0xc093aa0 H:|-(NSSpace(20))-[UIButton:0xc064170] (Names: '|':UIView:0xbd51f40 )>,
<NSLayoutConstraint:0xc093b30 UIButton:0xc064170.width == UIButton:0xc0576a0.width>,
<NSLayoutConstraint:0xc093be0 H:[UIButton:0xc064170]-(20)-[UILabel:0xc073990]>,
<NSLayoutConstraint:0xc093c10 UILabel:0xc073990.width == UIButton:0xc0576a0.width>,
<NSLayoutConstraint:0xc096530 H:[UILabel:0xc073990]-(20)-[UIButton:0xc0576a0]>,
<NSLayoutConstraint:0xc096590 H:[UIButton:0xc0576a0]-(NSSpace(20))-| (Names: '|':UIView:0xbd51f40 )>,
<NSAutoresizingMaskLayoutConstraint:0xc0af130 h=--& v=--& H:[UIView:0xbd51f40(768)]>
)
(lldb) po [0xc09a4c0 constraintsAffectingLayoutForAxis:0]
<__NSArrayM 0xc36a0a0>(
<NSLayoutConstraint:0xc09c9a0 H:|-(<=0)-[UIButton:0xc09a4c0] (Names: '|':UIView:0xbd51f40 )>,
<NSContentSizeLayoutConstraint:0xc0ad530 H:[UIButton:0xc09a4c0(110)] Hug:250 CompressionResistance:750>
As you can see from the po commands, I am getting NSAutoresizingMaskLayoutConstraints. I thought this shouldn't happen?
How can I ensure that I don't get these?
OK, I have figured out that the constraints did not appropriately express the relationships between the elements. I have updated the code, and now I get clean constraints everywhere except for the main UIWindow, which I am sure is just because of the UILayoutGuide being ambiguous:
*<UIWindow:0xcc6d7f0> - AMBIGUOUS LAYOUT
| *<UILayoutContainerView:0xbd34720>
| | *<UINavigationTransitionView:0xbd4b140>
| | | *<UIViewControllerWrapperView:0xcc48400>
| | | | *<UILayoutContainerView:0xbd5ad40>
| | | | | *<UITransitionView:0xbd635c0>
| | | | | | *<UIViewControllerWrapperView:0xcca68b0>
| | | | | | | *<UIView:0xcc9ddd0>
| | | | | | | | *<_UILayoutGuide:0xcc9de30> - AMBIGUOUS LAYOUT
| | | | | | | | *<_UILayoutGuide:0xcc9e090> - AMBIGUOUS LAYOUT
| | | | | | | | *<UIButton:0xcc9d910>
| | | | | | | | | <UIImageView:0xcc9dd20>
| | | | | | | | *<UILabel:0xcc9e860>
| | | | | | | | *<UIButton:0xcc9ebb0>
| | | | | | | | | <UIImageView:0xcc9eda0>
| | | | | | | | *<UIButton:0xcc9f020>
| | | | | | | | | <UIButtonLabel:0xcc9f470>
| | | | | | | | *<UIButton:0xcca0620>
| | | | | | | | | <UIButtonLabel:0xcca0830>
| | | | | | | | *<UIButton:0xcca15d0>
| | | | | | | | | *<UIButtonLabel:0xcca17e0>
| | | | | | | | *<UILabel:0xcca2580>
| | | | | | | | *<UILabel:0xcca2710>
| | | | | | | | *<UIButton:0xcca28d0>
| | | | | | | | | *<UIButtonLabel:0xcca2ae0>
| | | | | | | | *<UILabel:0xcca3880>
| | | | | | | | *<UILabel:0xcca3bd0>
| | | | | | | | *<UIButton:0xcca3e30>
| | | | | | | | | *<UIButtonLabel:0xcca4040>
| | | | | | | | *<UILabel:0xcca4de0>
| | | | | | | | *<UILabel:0xcca5000>
| | | | | <UITabBar:0xcc69e40>
| | | | | | <_UITabBarBackgroundView:0xcccad10>
| | | | | | | <_UIBackdropView:0xcccb150>
| | | | | | | | <_UIBackdropEffectView:0xcccb730>
| | | | | | | | <UIView:0xcccb7d0>
| | | | | | <UITabBarButton:0xcc96bf0>
| | | | | | | <UITabBarSwappableImageView:0xcc8d330>
| | | | | | | <UITabBarButtonLabel:0xcc8d270>
| | | | | | <UITabBarButton:0xcc98830>
| | | | | | | <UITabBarSwappableImageView:0xcc81840>
| | | | | | | <UITabBarButtonLabel:0xcc983a0>
| | | | | | <UITabBarButton:0xcc9a050>
| | | | | | | <UITabBarSwappableImageView:0xcc8e3e0>
| | | | | | | <UITabBarButtonLabel:0xcc9a130>
| | | | | | <UITabBarButton:0xcc9a560>
| | | | | | | <UITabBarSwappableImageView:0xcc9b140>
| | | | | | | <UITabBarButtonLabel:0xcc9aa20>
| | | | | | <UIImageView:0xcccbf20>
| | <UINavigationBar:0xbd27340>
| | | <_UINavigationBarBackground:0xbd2fc60>
| | | | <_UIBackdropView:0xbd43740>
| | | | | <_UIBackdropEffectView:0xbd2c200>
| | | | | <UIView:0xbd46100>
| | | | <UIImageView:0xbd2ffd0>
| | | <UINavigationItemView:0xbd1e270>
| | | | <UILabel:0xbd283c0>
| | | <_UINavigationBarBackIndicatorView:0xbd566e0>
The UILayoutGuide is not set anywhere in my code, so I am not quite sure what to do with it.

Error Copying DBF/MDX files

I use the following code to copy dbf/mdx files from one folder to another:
procedure TfrmMain.MyCopyFile(S1, S2: string);
begin
  if not FileExists(S2) then
    CopyFile(PChar(S1), PChar(S2), True)
  else
    if Application.MessageBox(PChar('Overwrite existing file ' + S2 + '?'),
         'File exists in folder', MB_YESNO + MB_DEFBUTTON1) = IDYES then
      CopyFile(PChar(S1), PChar(S2), False);
end;
The code works fine when the table name stays the same. But if I change the name of the table:
MyCopyFile(CurPath + '\orders.dbf', NewPath + '\ordly.dbf');
MyCopyFile(CurPath + '\orders.mdx', NewPath + '\ordly.mdx');
When I try to open ordly.dbf I get an error message:
Corrupt table/index header.
File: C:\DATA\2011\ORDLY.MDX
The problem is that the MDX format stores the name of the associated data file (the table name) inside the index file. Because of that, when you rename an MDX file, the index still points to the old data file name.
Check this link to see the structure of the MDX file:
The Structure of Multiple Index files (*.mdx)
0 | Version number *1| ^
|-----------------------| |
1 | Date of creation | |
2 | YYMMDD | |
3 | | |
|-----------------------| |
4 | Data file name | File
5 | (no extension) | Header
: : |
: : |
19 | | |
|-----------------------| |
20 | Block size | |
| | |
|-----------------------| |
22 | Block size adder N | |
| | |
|-----------------------| |
24 | Production index flag | |
|-----------------------| |
25 | No. of entries in tag | | *2
|-----------------------| |
26 | Length of tag | | *3
|-----------------------| |
27 | (Reserved) | |
|-----------------------| |
28 | No.of tags in use | |
| | |
|-----------------------| |
30 | (Reserved) | |
| | |
|-----------------------| |
32 | No.of pages in tagfile| |
| | |
| | |
35 | | |
|-----------------------| |
36 | Pointer to first free | |
| page | |
| | |
39 | | |
|-----------------------| |
40 | No.of block available | |
| | |
| | |
43 | | |
|-----------------------| |
44 | Date of last update | |
| YYMMDD | |
46 | | |
|-----------------------| |
47 | (Reserved) | |
|-----------------------| |
48 | (Garbage) | |
: : |
: : |
| | | ___|=======================|
543| | _V___ / 0 | Tag header page no. |
|-----------------------| | / | |
544| Tag table entries | Tag / | |
| | Table | 3 | |
:.......................: | | |-----------------------| Tag
: : | | 4 | Tag name | table
:.......................: | | : :
: : | / : :
: : | / | |
:.......................:__|_/ 14 | |
: : | |-----------------------|
: : | 15 | Key format *4 |
: : | |-----------------------|
:.......................:__|_ 16 | Forward tag thread (<)|
: : | \ |-----------------------|
: : | \ 17 | Forward tag thread (>)|
: : | \ |-----------------------|
: : | | 18 | Backward tag thread *5|
| | | | |-----------------------|
| | | | 19 | (Reserved) |
M*N| |__V__ | |-----------------------|
|=======================| ^ | 20 | Key type *6 |
0| Pointer to root page | | | |-----------------------|
| | | | 21 | (Reserved) |
| | | | : :
3| | | | : :
|-----------------------| | | 31 | |
4| File size in pages | Tag | |-----------------------|
| | header| 32 | (Garbage) |
| | | | : :
7| | | | | |
|-----------------------| | \ N | |
8| Key format *7 | | \____|=======================|
|-----------------------| |
9| Key type *8 | |
|-----------------------| |
10| (Reserved) | |
| | |
|-----------------------| |
12| Index key length *9 | |
| | |
|-----------------------| |
14| Max.no.of keys/page | |
| | |
|-----------------------| |
16| Secondary key type *10| |
| | |
|-----------------------| |
18| Index key item length | |
| | |
|-----------------------| |
20| (Reserved) | |
| | |
| | |
|-----------------------| |
23| Unique flag | |
|-----------------------| |
| | |
: : |
: :__V__
N*M|=======================|
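Given the header layout above (bytes 4-19 hold the data file name without extension), one workaround is to patch that field in the copied .mdx so it points at the new table name. Here is a minimal sketch, in Python rather than Delphi, purely to illustrate the byte-level fix; the offset and padding are taken from the structure above and should be verified against your own files:

def patch_mdx_table_name(mdx_path, new_table_name):
    # Overwrite the "Data file name (no extension)" field at bytes 4-19
    # of the MDX header so the index points at the renamed table.
    name = new_table_name.upper().encode("ascii")
    if len(name) > 16:
        raise ValueError("table name does not fit in the 16-byte header field")
    with open(mdx_path, "r+b") as f:
        f.seek(4)                          # field starts at offset 4 per the layout above
        f.write(name.ljust(16, b"\x00"))   # pad the rest of the field (zero padding assumed)

# e.g. after copying orders.mdx to ordly.mdx:
# patch_mdx_table_name(r"C:\DATA\2011\ORDLY.MDX", "ORDLY")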
