SPSS gives the wrong N in crosstabs

I asked SPSS to calculate a crosstab for me, but the N comes out as N=1001 when, looking at the numbers, it should be N=1002.
How is this possible?
Thank you for your help!
[Picture of the referenced crosstab]

As I haven't seen your data structure I am not sure if this is what the problem is caused by, but this problem has been reported to IBM before for string variables. If you have stored the data as a string variable (e.g. "35 bis 54 Jahre"), I would recommend recoding it either numerically or into shorter strings (max. 8 characters).
Here is the link to IBM's page: https://www.ibm.com/support/pages/crosstabs-display-wrong-count-and-totals-large-dataset-and-long-string-variable-table

Related

large integers in cypher, neo4j

I have a dataset with some hexadecimal integers like '4726E440'.
I want to add these numbers as attributes of the nodes.
If I execute:
CREATE (n {id:toInt("4726E440")});
neo4j gives me this error:
integer, 4726E440, is too large
Is there any way to handle this kind of integers (other than saving them as strings)?
Not 100% sure, but this looks like you're trying to convert a string holding a floating point number, 4726*10^440, to an int value. That one obviously is too large.
If you want to use hex literals you need to prefix them with 0x, e.g.
return toInt(0x4726E440)
returns 1193731136 - so it's still in range.
If you are wondering what the actual limit for number size in Neo4J is, this forum post might interest you.
Basically, Neo4J uses signed 64bit integers with a maximum of 2**63 - 1. There seems to be no way to increase this limit at the moment, and you will have to resort to strings or byte lists if you really have to store numbers of this size.
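As a sanity check outside Neo4j, the same conversion can be sketched in Python (the hex string is the one from the question; `INT64_MAX` is just a name for the signed 64-bit ceiling mentioned above):

```python
# Parse the hex string as base-16, which is what Cypher's 0x literal does,
# rather than as a float literal with an exponent.
value = int("4726E440", 16)
print(value)                  # 1193731136

INT64_MAX = 2**63 - 1         # Neo4j stores integers as signed 64-bit values
print(value <= INT64_MAX)     # True: well within range
```

So the value itself is unproblematic; only the float interpretation of the `E` makes it blow up.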
Just to build on the other answers, you'll need to wrap your big number in toInteger() in Cypher. The following numbers should not equal one another, but Neo4j thinks they do. (Code was run in Neo4j v4.2, first via the browser interface and then using the Python driver):
RETURN 2^63-2 AS Minus2, 2^63-1 AS Minus1, 2^63-2 = 2^63-1 AS Comparison
╒═════════════════════╤═════════════════════╤════════════╕
│"Minus2"             │"Minus1"             │"Comparison"│
╞═════════════════════╪═════════════════════╪════════════╡
│9223372036854776000.0│9223372036854776000.0│true        │
└─────────────────────┴─────────────────────┴────────────┘
But, if you convert the big number to an integer in the statement, Cypher reads it correctly:
RETURN toInteger(2^63)-2 AS Minus2, toInteger(2^63)-1 AS Minus1, toInteger(2^63)-2 = toInteger(2^63)-1 AS Comparison
╒═══════════════════╤═══════════════════╤════════════╕
│"Minus2"           │"Minus1"           │"Comparison"│
╞═══════════════════╪═══════════════════╪════════════╡
│9223372036854775805│9223372036854775806│false       │
└───────────────────┴───────────────────┴────────────┘
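The same precision loss can be reproduced outside Neo4j, since it comes from IEEE 754 doubles rather than from Cypher itself; a minimal Python sketch:

```python
# 2**63 - 1 has no exact IEEE 754 double representation, so both expressions
# round to the same float, just like the first Cypher result above.
a = float(2**63 - 2)
b = float(2**63 - 1)
print(a == b)                  # True: the floats are indistinguishable

# Exact integer arithmetic keeps them distinct, like the toInteger() variant.
print(2**63 - 2 == 2**63 - 1)  # False
```

Near 2^63 the spacing between representable doubles is 2048, so any two integers that close together collapse onto the same float.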

How do you include categories with 0 responses in SPSS frequency output?

Is there a way to display response options that have 0 responses in SPSS frequency output? The default is for SPSS to omit in the frequency table output any response option that is not selected by at least a single respondent. I looked for a syntax-driven option to no avail. Thank you in advance for any assistance!
The category doesn't show because not a single case in the data has that attribute. So by forcing a row of zeros, realize that we're asking SPSS to display something it never actually observed.
Having said that, you can introduce a fake case with the missing category. E.g. if you have Orange, Apple, and Pear, but no one answered that they like Pear, then add one fake case that says Pear.
Now, make a new weight variable that consists of only 1s. But for the Pear case, make it very, very small, e.g. 0.00001. Then go to Data > Weight Cases > Weight cases by and select that new weight variable. Click OK to apply. Now SPSS will treat the real cases with a weight of 1 and the fake case with a weight that is 1/100000 of a normal case. If you rerun the frequencies, you should see the category with zero count show up.
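The effect of the tiny-weight trick can be sketched in plain Python (category names and the 0.00001 weight are the ones from the example above):

```python
from collections import defaultdict

responses = ["Orange", "Apple", "Orange"]   # nobody chose "Pear"
weights   = [1.0, 1.0, 1.0]

responses.append("Pear")                    # fake case
weights.append(0.00001)                     # negligible weight

# Weighted counts: real cases contribute 1, the fake case almost nothing.
counts = defaultdict(float)
for r, w in zip(responses, weights):
    counts[r] += w

for category in ("Orange", "Apple", "Pear"):
    print(category, round(counts[category]))  # Pear now appears, with count 0
```

Rounded to whole cases, the fake Pear contributes a count of 0 while still forcing the category to exist in the table.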
If you have purchased the Custom Tables module you can also do that directly, as far as I can tell from their technical document. That module costs 637 to 3630 depending on license type, so it is probably only worth a try if your institution already has it.
So, I'm a noob with SPSS, I (shame on me) have a cracked version of SPSS 22, and if I understood your question correctly, this is my solution:
Double-click the frequency table in the Output window.
Right-click the table and select Table Properties.
Go to General and uncheck the Hide empty rows and columns option.
Hope this helps someone!
If your SPSS version has no Custom Tables module installed and you haven't collected the money for it yet, then run the following syntax:
*Note: please use variable names up to 8 characters long.
set mxloops 1000. /*in case your list of values is longer than 40
matrix.
get vars /vari= V1 V2 /names= names /miss= omit. /*V1 V2 here is your categorical variable(s)
comp vals= {1,2,3,4,5,99}. /*let this be the list of possible values shared by the variables
comp freq= make(ncol(vals),ncol(vars),0).
loop i= 1 to ncol(vals).
comp freq(i,:)= csum(vars=vals(i)).
end loop.
comp names= {'vals',names}.
print {t(vals),freq} /cnames= names /title 'Frequency'. /*here you are - the frequencies
print {t(vals),freq/nrow(vars)*100} /cnames= names /format f8.2 /title 'Percent'. /*and percents
end matrix.
*If variables have missing values, they are deleted listwise. To include missings, use
get vars /vari= V1 V2 /names= names /miss= -999. /*or other value
*To exclude missings individually from each variable, analyze by separate variables.
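For reference, the zero-inclusive tabulation that the MATRIX block performs can be sketched in Python; the data values and the code list here are hypothetical, mirroring the `vals` vector above:

```python
# One categorical variable; 3 and 4 are valid codes that no one picked.
values   = [1, 2, 2, 5, 99, 1, 2]
possible = [1, 2, 3, 4, 5, 99]    # full list of possible codes

n = len(values)
for v in possible:
    freq = values.count(v)                              # count, 0 if absent
    print(f"{v:>3}  {freq:>4}  {freq / n * 100:6.2f}")  # code, count, percent
```

Because the loop runs over `possible` rather than over the observed values, the zero-count categories appear in the output, which is exactly what the MATRIX approach achieves.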

SPSS percentile issue

I am working with SPSS 18.
I am using FREQUENCIES to calculate the 95th percentile of a variable.
FREQUENCIES SdrelPromSldDeu_Acr_5_0
/FORMAT=NOTABLE
/PERCENTILES 1,5,95,99.
The result is given in a table
Statistics
SdrelPromSldDeu_Acr_5_0
N            Valid    8881
             Missing  0
Percentiles  1        -1,001060644014
             5        -1,000541440102
             95       6619,140632636228
             99       9223372,036854776000
But if I double-click the 9223372,036854776 to copy it, another number appears: 1.0757943411193715E7.
If I use MEANS to get the maximum value, the result is 2.4329524990388575E8, so the number that appears on the double-click seems possible.
I have seen 9223372,03 in other cases as well, as if it were some kind of upper limit SPSS is able to display.
Can anybody tell me if the 9223372,03 represents anything useful? Should I trust the bigger number?
Thanks!
It appears to be a bug in the display of SPSS.
The number you have shown is eerily similar to
9223372036854775807
which is the highest value possible if a variable is declared as a long integer.
see also:
https://en.wikipedia.org/wiki/9223372036854775807
Since your actual number is about 11 orders of magnitude smaller, it should not reach this limit. Hence the conclusion that it must be a bug in the display software.
Do not trust it.
(the number behind may or may not be right, but the 9223372,03 is surely wrong)
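The resemblance can be checked numerically: the suspicious percentile is exactly the signed 64-bit maximum with the decimal point shifted 12 places, which supports the display-bug theory (a quick Python sketch):

```python
# The displayed 99th percentile, 9223372,036854776, looks like
# 9223372036854775807 scaled down by 10**12 and rounded to double precision.
INT64_MAX = 2**63 - 1            # 9223372036854775807
print(INT64_MAX / 10**12)        # 9223372.036854776
```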

ReportBuilder (Digital Metaphors) sum with comma values is not functioning. Need directions?

I am working with Firebird 2.5, Delphi XE, and ReportBuilder.
I have a table with a column containing values like 12,15 and 52,63 (comma as decimal separator). In the summary band of the report I use DBCalc and want the total sum of this column.
The problem is that the result is not correct: it comes up as 64 instead of 64,78.
How can I resolve this problem? Please help.
Make sure the box in your total line is wide enough to display the complete number. The default might make it a little too small.

What is the MAXIMUM number of variables that can be entered into an SPSS Frequencies command?

As it says on the tin. I know that it will be somewhere between 125 and 2013, but I am trying to streamline my code.
Any help greatly appreciated!
The docs say 1000 is the maximum (see the Limitations section). You could also hit the limits of displaying tables in the output in memory, making the effective number smaller (it depends on the style of the tables you output as well as the number of rows displayed in them).
This isn't to say that outputting 1,000 frequency tables is ever a really good idea. Why would you ever want to visually pore over that many tables?
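If the variable list is longer than the per-command limit, one way to streamline is to generate the syntax in batches; a sketch in Python, assuming hypothetical variable names V1..V2500 and the documented limit of 1000:

```python
# Split a long variable list into chunks of at most 1000 names and
# emit one FREQUENCIES command per chunk.
variables = [f"V{i}" for i in range(1, 2501)]   # hypothetical names

LIMIT = 1000
for start in range(0, len(variables), LIMIT):
    chunk = variables[start:start + LIMIT]
    print(f"FREQUENCIES VARIABLES={' '.join(chunk)}.")
```

For 2500 variables this prints three FREQUENCIES commands, each within the limit.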
