I'm using Delphi to create a database in MS Access, but when I click the button to add the tables to the database, it flags the syntax as incorrect.
cs:='CREATE TABLE tblRecordOfGames ('+
'Username Varchar CONSTRAINT FK_Username '+
'REFERENCES tblUsers (Username),'+
'TimeOfGame Date/Time,'+
'MovesTaken Integer(3)'+
'OptimalMoves Integer(3)'+
'PercentageofOptimalMoves Double(5)'+
'CreditsWon Integer'+
'CreditsLost Integer)';
ADOCommand1.CommandText:=cs;
ADOCommand1.Execute;
I think it's an issue with the way I'm trying to add the key but I've been having a hard time finding a working example.
Several of your data types are wrong, and you are missing several commas.
Setting a field size / precision is only valid for types TEXT and DECIMAL.
See http://allenbrowne.com/ser-49.html (the DDL column).
Your CONSTRAINT is valid, though. :)
Try this:
CREATE TABLE tblRecordOfGames (
Username Text(50) CONSTRAINT FK_Username REFERENCES tblUsers (Username),
TimeOfGame DateTime,
MovesTaken Integer,
OptimalMoves Integer,
PercentageofOptimalMoves Double,
CreditsWon Integer,
CreditsLost Integer)
Note: you should add a Primary Key.
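For example, an AutoNumber key could be added as the first column of that CREATE TABLE (GameID and PK_RecordOfGames are just assumed names here):
GameID COUNTER CONSTRAINT PK_RecordOfGames PRIMARY KEY,
COUNTER is the DDL name for an AutoNumber column (see the DDL column on the page linked above).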
For the following table:
I run the following stored procedure:
I'm redirected to the "Results" tab and see nothing. Then if I click the "refresh" icon (below the Results tab), I get a dialog saying:
SQLCODE = -625 validation error for column ID, value "* null *"
And of course, nothing is added...
As far as I understand, Firebird expects some value for RC_ID (which is my PK and should, in principle, be incremented automatically). If I also give a value for RC_ID, it works fine.
So, what should I do to make a clean INSERT without these errors?
The problem is that you are not setting a value for the primary key. Contrary to your expectation, primary keys are not automatically incremented; this is the case in any database I'm aware of. You always need to mark the column as identity, auto-increment, generated, or similar to get that behavior, although some tools (table builders) may apply this for you by default.
If you are using Firebird 3, you can define your column as GENERATED BY DEFAULT AS IDENTITY (see Identity Column Type in the Firebird 3 release notes). For earlier Firebird versions the best way is to define a sequence (also known as a generator) with a before insert trigger that populates the primary key column.
For more details on how to define an identity column (or define the trigger), see my answer on this question: Easiest way to create an auto increment field in Firebird database.
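For illustration, a rough sketch of both approaches, using the table, column and generator names that appear in this question (the trigger name and the RC_NAAM length are assumptions):
-- Firebird 3 and later: identity column
CREATE TABLE RESERVATIONCATEGORY (
  RC_ID INTEGER GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
  RC_NAAM VARCHAR(100)  -- length assumed for the example
);
-- Firebird 2.5 and earlier: sequence plus BEFORE INSERT trigger
CREATE SEQUENCE GEN_RESERVATIONCATEGORY_ID;
SET TERM ^ ;
CREATE TRIGGER RESERVATIONCATEGORY_BI FOR RESERVATIONCATEGORY
ACTIVE BEFORE INSERT POSITION 0
AS
BEGIN
  -- only generate an id when the caller did not supply one
  IF (NEW.RC_ID IS NULL) THEN
    NEW.RC_ID = NEXT VALUE FOR GEN_RESERVATIONCATEGORY_ID;
END^
SET TERM ; ^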
In Firebird, auto-increment does not work the way it does in MySQL, so sending a value for RC_ID was a must...
I found some working examples based on this idea:
create a generator (see the one-liner after the example below)
assign it to the column (PK)
call GEN_ID with that generator, like this:
begin
insert into RESERVATIONCATEGORY (RC_ID, RC_NAAM)
values (
GEN_ID(GEN_RESERVATIONCATEGORY_ID, 1), 'selam'
);
suspend;
end
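Step 1 from the list above, for reference, is just this one line (using the same generator name as in the GEN_ID call):
CREATE GENERATOR GEN_RESERVATIONCATEGORY_ID;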
I have a values column in a table called Translation that contains, for each record, something like:
=> {"fr"=>"Jaune", "de"=>"", "en"=>"", "bg"=>"", "hr"=>"", "es"=>"", "hu"=>"", "it"=>"", "lt"=>"", "lv"=>"", "nl"=>"", "pl"=>"", "pt"=>"", "ro"=>"", "cs"=>""}
and I'm looking to get the number of translations for each language.
I'm trying:
Translation.where("values->>'fr' IS NOT NULL").count
but it gives me 0, which is not correct. Does anyone know how to do it correctly?
The problem you have is that the keys without values still exist in the JSON, so IS NOT NULL matches all of them because the key exists. You have two options here: remove the empty keys from the database so the NULL check works, or change the query to check whether the value is empty.
You can do it like this:
Translation.where("values->>'fr' <> ''").count
and it will work with the structure that you have right now.
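If you want the counts for every language at once, here is a rough raw-SQL sketch (assumptions: the column is jsonb and the table is called translations; use json_each_text instead for a plain json column):
SELECT key AS language, count(*) AS translated
FROM translations, jsonb_each_text(translations."values")  -- "values" must be quoted: VALUES is a reserved word
WHERE value <> ''
GROUP BY key
ORDER BY key;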
Using Delphi 10.2, SQLite and TeeChart. My SQLite database has a table with two fields, created with:
CREATE TABLE HistoryRuntime ('DayTime' DateTime, Device1 INTEGER DEFAULT (0));
I access the table using a TFDQuery called qryGrpahRuntime with the following SQL:
SELECT DayTime AS TheDate, Sum(Device1) As DeviceTotal
FROM HistoryRuntime
WHERE (DayTime >= "2017-06-01") and (DayTime <= "2017-06-26")
Group by Date(DayTime)
Using the Field Editor in the Delphi IDE, I can add two persistent fields, getting TheDate as a TDateTimeField and DeviceTotal as a TLargeIntField.
I run this query in a program to create a TeeChart, which I created at design time. As long as the query returns some records, all this works. However, if there are no records for the requested dates, I get an EDatabaseError exception with the message:
qryGrpahRuntime: Type mismatch for field 'DeviceTotal', expecting: LargeInt actual: Widestring
I have done plenty of searching on the web for ways to prevent this error on an empty query, but have had no luck with anything I found. From what I can tell, SQLite defaults to a wide string field when no data is returned. I have tried using CAST in the query and it did not seem to make any difference.
If I remove the persistent fields, the query will open without problems on an empty return set. However, in order to use the TeeChart editor in the IDE, it appears I need persistent fields.
Is there a way I can make this work with persistent fields, or am I going to have to throw out the persistent fields and then add the TeeChart Series at runtime?
This behavior is described in the Adjusting FireDAC Mapping chapter of FireDAC's SQLite manual:
For an expression in a SELECT list, SQLite avoids type name information. When the result set is not empty, FireDAC uses the value data types from the first record. When empty, FireDAC describes those columns as dtWideString. To explicitly specify the column data type, append ::<type name> to the column alias:
SELECT count(*) as "cnt::INT" FROM mytab
So modify your command e.g. this way (I used BIGINT, but you can use any pseudo data type that maps to a 64-bit signed integer data type and is not auto incrementing, which corresponds to your persistent TLargeIntField field):
SELECT
DayTime AS "TheDate",
Sum(Device1) AS "DeviceTotal::BIGINT"
FROM
HistoryRuntime
WHERE
DayTime BETWEEN {d 2017-06-01} AND {d 2017-06-26}
GROUP BY
Date(DayTime)
P.S. I did a small optimization by using the BETWEEN operator (which evaluates the column value only once), and used an escape sequence for the date constants (which, in real code, you would replace with parameters, I guess; so just out of curiosity).
This data type hinting is parsed by the FDSQLiteTypeName2ADDataType procedure, which takes and parses a column name in the format <column name>::<type name> through its AColName parameter.
I have three rails objects: User, DemoUser and Stats. Both the User and the DemoUser have many stats associated with them. The User and Stats tables are stored on Postgresql (using ActiveRecord). The DemoUser is stored in redis. The id for the DemoUser is a (random) string. The id for the User is a (standard-rails) incrementing integer.
The stats table has a user_id column that can contain either the User id or the DemoUser id. For that reason, the user_id column is a string, rather than an integer.
There isn't an easy way to translate from the random string to an integer, but there's a very easy way to translate the integer id to a string (42 -> "42"). The ids are guaranteed not to overlap (there won't be a User instance with the same id as a DemoUser, ever).
I have some code that manages those stats. I'd like to be able to pass in a some_user instance (which can be either a DemoUser or a User) and then use its id to fetch stats, update them, etc. It would also be nice to be able to define a has_many on the User model, so I can do things like user.stats.
However, operations like user.stats would create a query like
SELECT "stats".* FROM "stats" WHERE "stats"."user_id" = 42
which then breaks with PG::UndefinedFunction: ERROR: operator does not exist: character varying = integer
Is there a way to either let the database (Postgresql), or Rails do auto-translation of the ids on JOIN? (the translation from integer to string should be simple, e.g. 42 -> "42")
EDIT: updated the question to try to make things as clear as possible. Happy to accept edits or answer questions to clarify anything.
You can't define a foreign key between two types that don't have built-in equality operators.
The correct solution is to change the string column to be an integer.
In your case you could create a user-defined = operator for varchar = integer, but that would have messy side effects elsewhere in the database; for example, it would allow bogus code like:
SELECT 2014-01-02 = '2014-01-02'
to run without an error. So I'm not going to give you the code to do that. If you truly feel it's the only solution (which I don't think is likely to be correct) then see CREATE OPERATOR and CREATE FUNCTION.
One option would be to have separate user_id and demo_user_id columns in your stats table. The user_id would be an integer that you could use as a foreign key to the users table in PostgreSQL and the demo_user_id would be a string that would link to your Redis database. If you wanted to treat the database properly, you'd use a real FK to link stats.user_id to users.id to ensure referential integrity and you'd include a CHECK constraint to ensure that exactly one of stats.user_id and stats.demo_user_id was NULL:
check ((user_id is null) <> (demo_user_id is null))
You'll have to fight ActiveRecord a bit to properly constrain your database, of course; AR doesn't believe in fancy things like FKs and CHECKs, even though they are necessary for data integrity. You'd have to keep demo_user_id under control by hand, though; some sort of periodic scan to make sure the values link up with entries in Redis would be a good idea.
Now your User can look up stats using a standard association to the stats.user_id column and your DemoUser can use stats.demo_user_id.
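Roughly sketched in PostgreSQL (user_id and demo_user_id are from the question; the constraint name and the remaining columns are assumptions):
CREATE TABLE stats (
  id serial PRIMARY KEY,
  user_id integer REFERENCES users (id),  -- real FK into the PostgreSQL users table
  demo_user_id varchar,                   -- string id of the DemoUser held in Redis
  -- exactly one of the two owner columns must be set
  CONSTRAINT one_owner CHECK ((user_id IS NULL) <> (demo_user_id IS NULL))
  -- ... other stat columns ...
);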
For the time being, my 'solution' is not to use a has_many in Rails, but to define some helper methods in the models where necessary, e.g.
class User < ActiveRecord::Base
# ...
def stats
Stats.where(user_id: self.id.to_s)
end
# ...
end
I would also define some helper scopes to help enforce the to_s translation:
class Stats < ActiveRecord::Base
scope :for_user_id, -> (id) { where(user_id: id.to_s) }
# ...
end
This should allow calls like
user.stats and Stats.for_user_id(user.id)
I think I misunderstood a detail of your issue before because it was buried in the comments.
(I strongly suggest editing your question to clarify points when comments show that there's something confusing/incomplete in the question).
You seem to want a foreign key from an integer column to a string column because the string column might be an integer, or might be some unrelated string. That's why you can't make it an integer column - it's not necessarily a valid number value; it might be a textual key from a different system.
The typical solution in this case would be to have a synthetic primary key and two UNIQUE constraints instead, one for keys from each system, plus a CHECK constraint preventing both from being set. E.g.
CREATE TABLE my_referenced_table (
id serial,
system1_key integer,
system2_key varchar,
CONSTRAINT exactly_one_key_must_be_set
CHECK ((system1_key IS NULL) != (system2_key IS NULL)),
UNIQUE(system1_key),
UNIQUE(system2_key),
PRIMARY KEY (id),
... other values ...
);
You can then have a foreign key referencing system1_key from your integer-keyed table.
It's not perfect, as it doesn't prevent the same value appearing in two different rows, one for system1_key and one for system2_key.
So an alternative might be:
CREATE TABLE my_referenced_table (
the_key varchar primary key,
the_key_ifinteger integer,
CONSTRAINT integerkey_must_equal_key_if_set
CHECK (the_key_ifinteger IS NULL OR (the_key_ifinteger::varchar = the_key)),
UNIQUE(the_key_ifinteger),
... other values ...
);
CREATE OR REPLACE FUNCTION my_referenced_table_copy_int_key()
RETURNS trigger LANGUAGE plpgsql STRICT
AS $$
BEGIN
IF NEW.the_key ~ '^[\d]+$' THEN
NEW.the_key_ifinteger := CAST(NEW.the_key AS integer);
END IF;
RETURN NEW;
END;
$$;
CREATE TRIGGER copy_int_key
BEFORE INSERT OR UPDATE ON my_referenced_table
FOR EACH ROW EXECUTE PROCEDURE my_referenced_table_copy_int_key();
which copies the integer value if it's an integer, so you can reference it.
All in all though I think the whole idea is a bit iffy.
I think I may have a solution for your problem, but maybe not a massively better one:
class User < ActiveRecord::Base
has_many :stats, primary_key: "id_s"
def id_s
read_attribute(:id).to_s
end
end
It still uses a second virtual column, but it may be handier to use with Rails associations, and it is database agnostic.
I want to make a copy of every record inserted into jobact in a new table, jobactupdates. I am using a stored procedure for this purpose. Both tables are exactly the same and have the same number of columns. When I insert data into jobact using an INSERT query, the stored procedure fails and shows a Data Type Mismatch error.
My code looks like this:
PROCEDURE insertData
INSERT INTO jobactupdates (jobcode ,jobdescr ,fileno ,port ,mastcode ,mastdescr ,mastdescr1 ,shipper ,goods ,unit1 ,qty ,unit ,vesname ,arremarks ,arrdate ,remarks ,docstat ,docdate ,blno ,bldate ,jastat ,rate ,demand ,received ,balance ,transpor,dldate);
VALUES(jobact.jobcode,jobact.jobdescr,jobact.fileno,jobact.port,jobact.mastcode,jobact.mastdescr,jobact.mastdescr1,jobact.shipper,jobact.goods,jobact.unit1,jobact.qty,jobact.unit,jobact.vesname,jobact.arremarks,jobact.arrdate,jobact.remarks,jobact.docstat,jobact.docdate,jobact.blno,jobact.bldate,jobact.jastat,jobact.rate,jobact.received,jobact.balance,jobact.transpor,jobact.dldate);
ENDPROC
A Data Type Mismatch error occurs when you try to insert an inappropriate data type into a field, for example if you try to store a string in an integer field. I would double-check the table structures and confirm that they are identical.
Another thing to be aware of is if any of the JOBACT field types are set to Integer (AutoInc). They will have to be set to just Integer in the JOBACTUPDATES table. Otherwise you will get a "Field is read-only" error message.
For the values:
Character fields: write them inside '' (quote) marks,
Numeric fields: just numbers, for example 123,
Date fields: {^yyyy-mm-dd} (optionally with a time part),
as in the small example below.
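A hypothetical one-row insert using those formats (only a few of the jobactupdates columns, with made-up values):
INSERT INTO jobactupdates (jobdescr, qty, dldate) ;
    VALUES ('Sample description', 123, {^2017-06-26})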
Is this your actual query? If so, the fact that your column list and VALUES clause contain different field lists has certainly caused this error:
Insert Into ...
bldate,
jastat,
rate,
demand,
received,
balance ..
Values ...
jobact.bldate,
jobact.jastat,
jobact.rate,
jobact.received, <--
jobact.balance, <--
jobact.transpor <--
The demand column appears in the column list, but jobact.demand is missing from the VALUES list, so every value from jobact.received onwards ends up shifted into the wrong column, and the two lists no longer have the same number of items.