I'm using a stored procedure to update a table when the user clicks Approve. I get a message that the approval succeeded, but the table wasn't updated.
Update
I'm now only able to update one item; even if I select more than one item, only one is updated.
Stored Procedure
Update RequisitionItem
set [status] = 0,
[approve_date] = @approve_date
--[ApprovedBy] = @ApprovedBy
where [status] = -1 and Req_No = @reqNumber and item_no = @item_no
Method
public void SetRequisitionStatus0(List<string> reqNumbers, List<string> item_no)
{
SqlConnection connection = new SqlConnection(connectionString);
SqlCommand command = new SqlCommand();
command.CommandText = "requisition_sp_setstatus0";
command.CommandType = CommandType.StoredProcedure;
command.Parameters.Add("#reqNumber", SqlDbType.VarChar);
command.Parameters.Add("#item_no", SqlDbType.VarChar);
//command.Parameters.Add("#ApprovedBy", SqlDbType.VarChar);
command.Parameters.Add("#approve_date", SqlDbType.DateTime).Value = DateTime.Now;
using (command.Connection = connection)
{
try
{
connection.Open();
foreach (var item in reqNumbers)
{
command.Parameters["#reqNumber"].Value = item;
foreach (var item1 in item_no)
{
command.Parameters["#item_no"].Value = item1;
}
command.ExecuteNonQuery();
}
}
finally
{
connection.Close();
}
}
return;
}
I think the foreach loops are tripping you up. The inner loop runs through the entire item_no list and only then returns control to the outer loop, so by the time ExecuteNonQuery runs, @item_no is always set to the last item. You need a single for loop that walks both lists in step, something like
for (int i = 0; i < reqNumbers.Count; i++)
{
    command.Parameters["@reqNumber"].Value = reqNumbers[i];
    command.Parameters["@item_no"].Value = item_no[i];
    command.ExecuteNonQuery();
}
You don't need to close the connection either; the using statement will close it automatically. And you don't need the final return.
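Putting both suggestions together, a minimal corrected version of the method might look like this (a sketch; it assumes reqNumbers and item_no always have the same length, which is worth validating):

public void SetRequisitionStatus0(List<string> reqNumbers, List<string> item_no)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand("requisition_sp_setstatus0", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add("@reqNumber", SqlDbType.VarChar);
        command.Parameters.Add("@item_no", SqlDbType.VarChar);
        command.Parameters.Add("@approve_date", SqlDbType.DateTime).Value = DateTime.Now;

        connection.Open();
        // One requisition number pairs with one item number at the same index.
        for (int i = 0; i < reqNumbers.Count; i++)
        {
            command.Parameters["@reqNumber"].Value = reqNumbers[i];
            command.Parameters["@item_no"].Value = item_no[i];
            command.ExecuteNonQuery();
        }
    }
}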
Related
How can I update a single value in an already existing row in the DB, given only the parameters I want to apply to that attribute?
Here is my code for a trivial approach, but it didn't work:
public bool BuyBook(int BookId, int UserId, int BookPrice)
{
    using (var ctx = new OnlineBooksEntities())
    {
        User updatedCustomer = (from c in ctx.Users
                                where c.UserId == UserId
                                select c).FirstOrDefault();
        updatedCustomer.Balance = BookPrice;
        ctx.SaveChanges();
    }
    this.DeleteBook(BookId);
    return true;
}
Adding an SQL query to the method solves the update problem:
public bool BuyBook(int BookId, int UserId, int BookPrice)
{
    try
    {
        using (var ctx = new OnlineBooksEntities())
        {
            User user = ctx.Users.Where(x => x.UserId == UserId).FirstOrDefault();
            BookPrice = (int)user.Balance + BookPrice;
            int noOfRowUpdated =
                ctx.Database.ExecuteSqlCommand("Update Users set Balance = " + BookPrice + " where UserId = " + UserId);
        }
        this.DeleteBook(BookId);
        return true;
    }
    catch (Exception)
    {
        return false;
    }
}
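As a side note, concatenating values into the SQL string invites SQL injection; the same call can be parameterized (a sketch using EF's ExecuteSqlCommand parameter overload):

int noOfRowUpdated = ctx.Database.ExecuteSqlCommand(
    "UPDATE Users SET Balance = @balance WHERE UserId = @userId",
    new SqlParameter("@balance", BookPrice),
    new SqlParameter("@userId", UserId));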
Updating basically means changing an existing row's value. Since you mentioned EF, you can do this by retrieving the object, changing its value, and saving it back. Thus you can do something like this:
using (var db = new MyContextDB())
{
var result = db.Books.SingleOrDefault(b => b.BookPrice == bookPrice);
if (result != null)
{
result.SomeValue = "Your new value here";
db.SaveChanges();
}
}
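Applied to the BuyBook scenario above, the same retrieve-modify-save pattern avoids raw SQL entirely; a minimal sketch (assuming, as in the raw-SQL answer, that the price should be added to the existing balance):

using (var ctx = new OnlineBooksEntities())
{
    var customer = ctx.Users.SingleOrDefault(u => u.UserId == UserId);
    if (customer != null)
    {
        // Add to the existing balance instead of overwriting it.
        customer.Balance = customer.Balance + BookPrice;
        ctx.SaveChanges();
    }
}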
I have image data from iOS as a base64-encoded string, which is extremely long:
let imageStr = imageData.base64EncodedString()
I am calling a .NET method from my iOS app that calls a stored procedure to insert these bytes into the database.
Here is my .NET method; I have the data type set to VarBinary:
public string PostLandGradingImages(List<Images> landingCells)
{
try
{
using (connection = new SqlConnection(connectionString))
{
connection.Open();
using (SqlCommand command = new SqlCommand("PostLandGradingImages", connection))
{
command.CommandType = CommandType.StoredProcedure;
for (int i = 0; i < landingCells.Count; i++)
{
command.Parameters.Clear();
SqlParameter parameter1 = new SqlParameter("@Job_No", SqlDbType.VarChar);
parameter1.Value = landingCells[i].jobNo;
parameter1.Direction = ParameterDirection.Input;
command.Parameters.Add(parameter1);
SqlParameter parameter2 = new SqlParameter("@Image", SqlDbType.VarBinary);
parameter2.Value = landingCells[i].imageBytes;
parameter2.Direction = ParameterDirection.Input;
command.Parameters.Add(parameter2);
command.ExecuteNonQuery();
}
}
}
}
catch (Exception e)
{
return e.Message.ToString();
}
return "All Good";
}
Here is my Images class; notice that imageBytes is defined as a byte[]:
public class Images
{
public string jobNo { get; set; }
public byte[] imageBytes { get; set; }
}
The column I am inserting into is defined as varbinary(MAX)
and here is my stored procedure:
ALTER PROCEDURE [dbo].[PostLandGradingImages]
-- Add the parameters for the stored procedure here
@Job_No varchar(MAX) = NULL,
@Image varbinary(MAX) = NULL
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
INSERT INTO LandGradingImages (Job_No, ImageBytes) VALUES (@Job_No, @Image)
END
My problem is that nothing is getting inserted; I am getting this error in my catch:
Object reference not set to an instance of an object.
My question is: what am I doing wrong? Should I not be sending a base64-encoded string, or is my class, or my DB column, defined incorrectly?
I tried this:
byte[] bytes = System.Convert.FromBase64String(landingCells[i].imageBytes);
SqlParameter parameter2 = new SqlParameter("@Image", SqlDbType.VarBinary, 800000);
parameter2.Value = bytes;
parameter2.Direction = ParameterDirection.Input;
command.Parameters.Add(parameter2);
It still does not work :( and I changed imageBytes to a string.
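If imageBytes arrives from iOS as a base64 string, the class and the decoding step have to agree; a sketch of that combination (keeping the property names from the question):

public class Images
{
    public string jobNo { get; set; }
    // Base64-encoded image data as sent by the iOS client.
    public string imageBytes { get; set; }
}

// Decode before binding the parameter, guarding against null so the
// command never sees a null reference. A size of -1 maps to varbinary(MAX).
byte[] bytes = String.IsNullOrEmpty(landingCells[i].imageBytes)
    ? new byte[0]
    : Convert.FromBase64String(landingCells[i].imageBytes);
command.Parameters.Add("@Image", SqlDbType.VarBinary, -1).Value = bytes;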
I modified your code a little into the method below. It creates a fresh SqlCommand (a CommandType.StoredProcedure) for every Image. The results are also returned per image, so you can see which ones failed; in your method, if you have 10 images and the 9th fails, you would not know that.
public List<Images> PostLandGradingImages(List<Images> landingCells)
{
//create a connection to the database
using (SqlConnection connection = new SqlConnection(Common.connectionString))
{
//loop all the images
for (int i = 0; i < landingCells.Count; i++)
{
//create a fresh sql command for every Image
using (SqlCommand command = new SqlCommand("PostLandGradingImages", connection))
{
command.CommandType = CommandType.StoredProcedure;
//add the parameters
command.Parameters.Add("#Job_No", SqlDbType.VarChar).Value = landingCells[i].jobNo;
command.Parameters.Add("#Image", SqlDbType.VarBinary).Value = landingCells[i].imageBytes;
try
{
//open the connection if closed
if (connection.State == ConnectionState.Closed)
{
connection.Open();
}
//execute the stored procedure
command.ExecuteNonQuery();
//set the save result to the image
landingCells[i].saveResult = true;
}
catch (Exception ex)
{
//handle error per Image
landingCells[i].errorMessage = ex.Message;
}
}
}
}
return landingCells;
}
In order to track the save result per image, I've added two properties to the Images class, but this can be done in various other ways as well.
public class Images
{
public string jobNo { get; set; }
public byte[] imageBytes { get; set; }
public bool saveResult { get; set; }
public string errorMessage { get; set; }
}
A simple test was done with the following code. None of the cases threw a NullReferenceException; even with both properties being null, a database entry was still made.
//create a new list with Images
List<Images> landingCells = new List<Images>();
//add some dummy data
landingCells.Add(new Images() { jobNo = null, imageBytes = null });
landingCells.Add(new Images() { jobNo = "Job 1", imageBytes = null });
landingCells.Add(new Images() { jobNo = null, imageBytes = new byte[10000] });
landingCells.Add(new Images() { jobNo = "Job 2", imageBytes = new byte[10000] });
//send the images to be saved
landingCells = PostLandGradingImages(landingCells);
//loop all the images to check the result
for (int i = 0; i < landingCells.Count; i++)
{
if (landingCells[i].saveResult == false)
{
//display the result for each failed image
Label1.Text += landingCells[i].errorMessage + "<br>";
}
}
If there is still a NullReferenceException, that means your List landingCells itself is null, or an Images object within that list is null (in which case it should never have been added to the list in the first place, imho). You can easily change the snippet to check for that.
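For example, a guard at the top of the method would surface those cases explicitly (a sketch):

if (landingCells == null)
{
    throw new ArgumentNullException(nameof(landingCells));
}
// Drop null entries rather than letting them fail one by one later.
landingCells.RemoveAll(image => image == null);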
Consider batching the queries in a transaction. You should also validate the values provided to the method, to make sure the stored procedure can be called correctly.
public int PostLandGradingImages(List<Images> landingCells) {
int count = 0;
using (var connection = new SqlConnection(connectionString)) {
connection.Open();
//Transaction to batch the actions.
using (var transaction = connection.BeginTransaction()) {
foreach (var image in landingCells) {
if (valid(image)) {//validate input properties.
try {
using (SqlCommand command = connection.CreateCommand()) {
    command.CommandType = CommandType.StoredProcedure;
    command.CommandText = "PostLandGradingImages";
    //enlist the command in the open transaction; without this,
    //ExecuteNonQuery throws an InvalidOperationException
    command.Transaction = transaction;
    command.Parameters
        .Add("@Job_No", SqlDbType.VarChar, image.jobNo.Length)
        .Value = image.jobNo;
    command.Parameters
        .Add("@Image", SqlDbType.VarBinary, image.imageBytes.Length)
        .Value = image.imageBytes;
count += command.ExecuteNonQuery();
}
} catch {
//TODO: Log error
}
}
}
if (landingCells.Count == count) {
transaction.Commit();
}
}
}
return count;
}
private bool valid(Images image) {
    return image != null && !String.IsNullOrWhiteSpace(image.jobNo)
        && image.imageBytes != null && image.imageBytes.Length > 0;
}
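Note the all-or-nothing design here: the transaction only commits when every ExecuteNonQuery reported an affected row, so a single failure rolls back the whole batch when the uncommitted transaction is disposed at the end of the using block.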
I had a method that fetched records from an SQLite database, but after a while I changed it a little and made a secondary method for fetching specific records using user-entered information.
I don't know why, but my original method is returning null now.
CarddbAdapter:
public CarddbAdapter open2() throws SQLException {
String myPath = DB_PATH + DB_NAME;
myDatabaseR = SQLiteDatabase.openDatabase(myPath, null,
SQLiteDatabase.OPEN_READONLY);
myDatabaseW = SQLiteDatabase.openDatabase(myPath, null,
SQLiteDatabase.OPEN_READWRITE);
return this;
}
public void MyDatabaseClose() {
myDatabaseW.close();
myDatabaseR.close();
}
public ArrayList<String> getAllCardNames() {
ArrayList<String> returnedAllCardNames;
ArrayList<String> NoResults;
ArrayList<String> NoResults2;
NoResults = new ArrayList<String>();
NoResults.add("cursor is null");
NoResults2 = new ArrayList<String>();
NoResults2.add("No similar cards found.");
returnedAllCardNames = new ArrayList<String>();
/*String sqlquery_cardNames = "SELECT " + KEY_CARDNAME
+ " FROM cards WHERE card_name like '%" + passedName
+ "%' ORDER BY card_name ASC";*/
//String sqlquery_cardNames;
String sqlquery_cardNames = "SELECT DISTINCT card_name FROM cards";
Cursor c_cardNames;
c_cardNames = myDatabaseW.rawQuery(sqlquery_cardNames, null);
c_cardNames.moveToFirst();
if (c_cardNames != null) {
do {
if (c_cardNames.getCount() > 0) {
String returnedName = c_cardNames.getString(c_cardNames
.getColumnIndex(KEY_CARDNAME));
returnedAllCardNames.add(returnedName);
} else {
return NoResults2;
}
} while (c_cardNames.moveToNext());
}
return NoResults;
}
How I am using it:
CarddbAdapter yugiohDB = new CarddbAdapter(this);
yugiohDB.open2();
search_results = (ListView) findViewById(R.id.lvSearchResults);
search_results.setOnItemClickListener(this);
ArrayList<String> returnedCards_list1 = new ArrayList<String>();
returnedCards_list1.addAll(yugiohDB.getAllCardNames());
listAdapter = new ArrayAdapter<String>(SearchMode_Simple.this,
android.R.layout.simple_list_item_1, returnedCards_list1);
search_results.setAdapter(listAdapter);
yugiohDB.MyDatabaseClose();
Any help would be appreciated.
If you would like to see what this is actually supposed to do, download my app, Yugioh Library + Tools. Click the Search Library button from the main menu and then the Simple Search button. It should display a list of cards from the database.
The reason I was changing it is that I'm setting up Spinners so users can choose different trading card sets, which would then list all the cards from that specific set.
Fixed. I forgot to return the string array returnedAllCardNames, which holds the names of all processed card records, after the do-while loop.
An error occurs when I call the function shown below:
Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. Refresh ObjectStateManager entries.
Function:
[HttpPost]
public ActionResult Index(InsertPo model)
{
var context = new UsersContext();
var po = new Po();
var user = new User();
po.PoId = 12;
po.PoNumber = model.Po.PoNumber;
po.Style = model.Po.Style;
po.Quantity = model.Po.Quantity;
po.Status = "hhh";
po.OrderDate = Convert.ToDateTime("30-12-2011");
po.ShipmentDate = Convert.ToDateTime("2-12-2011");
po.ProductionRate = 10;
po.UserId = 2;
/*buyer.BuyerName = model.Buyer.BuyerName;*/
/* buyer.BuyerId = 1;
buyer.PoId = 10;*/
context.Pos.Add(po);
context.SaveChanges();
return RedirectToAction("Index");
}
Try putting this line outside of your Action method.
var context = new UsersContext();
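In other words, make the context a field of the controller so every action shares the same instance; a sketch (the controller name here is hypothetical, and disposing the context with the controller is advisable):

public class PoController : Controller
{
    // Shared by all actions on this controller instance.
    private readonly UsersContext context = new UsersContext();

    [HttpPost]
    public ActionResult Index(InsertPo model)
    {
        var po = new Po();
        // ... populate po from model as before ...
        context.Pos.Add(po);
        context.SaveChanges();
        return RedirectToAction("Index");
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            context.Dispose();
        }
        base.Dispose(disposing);
    }
}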
I'm using SqlBulkCopy to copy some DataTables into a database table; however, because the files I'm copying sometimes exceed 600 MB, I keep running out of memory.
I'm hoping to get some advice about managing the table size before I commit it to the database, so I can free up some memory to continue writing.
Here are some examples of my code (some columns and rows eliminated for simplicity):
SqlBulkCopy sqlbulkCopy = new SqlBulkCopy(ServerConfiguration); //Define the Server Configuration
System.IO.StreamReader rdr = new System.IO.StreamReader(fileName);
Console.WriteLine("Counting number of lines...");
Console.WriteLine("{0}, Contains: {1} Lines", fileName, countLines(fileName));
DataTable dt = new DataTable();
sqlbulkCopy.DestinationTableName = "[dbo].[buy.com]"; //You need to define the target table name where the data will be copied
dt.Columns.Add("PROGRAMNAME");
dt.Columns.Add("PROGRAMURL");
dt.Columns.Add("CATALOGNAME");
string inputLine = "";
DataRow row; //Declare a row, which will be added to the above data table
while ((inputLine = rdr.ReadLine()) != null) //Read while the line is not null
{
i = 0;
string[] arr;
Console.Write("\rWriting Line: {0}", k);
arr = inputLine.Split('\t'); //splitting the line which was read by the stream reader object (tab delimited)
row = dt.NewRow();
row["PROGRAMNAME"] = arr[i++];
row["PROGRAMURL"] = arr[i++];
row["CATALOGNAME"] = arr[i++];
row["LASTUPDATED"] = arr[i++];
row["NAME"] = arr[i++];
dt.Rows.Add(row);
k++;
}
// Set the timeout, 600 seconds (10 minutes) given table size--damn that's a lota hooch
sqlbulkCopy.BulkCopyTimeout = 600;
try
{
sqlbulkCopy.WriteToServer(dt);
}
catch (Exception e)
{
Console.WriteLine(e);
}
sqlbulkCopy.Close();//Release the resources
dt.Dispose();
Console.WriteLine("\nDB Table Written: \"{0}\" \n\n", sqlbulkCopy.DestinationTableName.ToString());
}
I continued to have problems getting SqlBulkCopy to work, and I realized I needed to do more work on each record before it was entered into the database, so I developed a simple LINQ to SQL method to do record-by-record updates; that way I could edit other information and create more record information as it was being run.
Problem: this method has been running pretty slowly (even on a Core i3 machine). Any ideas on how to speed it up (threading?)? On a single processor core with 1 GB of memory, it crashes or sometimes takes 6-8 hours to write the same amount of data as one SqlBulkCopy call that takes a few moments. It does manage memory better, though.
while ((inputLine = rdr.ReadLine()) != null) //Read while the line is not null
{
Console.Write("\rWriting Line: {0}", k);
string[] arr;
arr = inputLine.Split('\t');
/* items */
if (fileName.Contains(",,"))
{
Item = Table(arr);
/* Check to see if the item is in the db */
bool exists = table.tables.Where(u => u.ProductID == Item.ProductID).Any();
/* Queue and commit only when the item is new; queuing before the check
   leaves a pending insert behind even when the item already exists */
if (!exists)
{
    table.tables.InsertOnSubmit(Item);
try
{
table.SubmitChanges();
}
catch (Exception e)
{
Console.WriteLine(e);
// Make some adjustments.
// ...
// Try again.
table.SubmitChanges();
}
}
}
With helper method:
public static class extensionMethods
{
/// <summary>
/// Method that provides the T-SQL EXISTS call for any IQueryable (thus extending Linq).
/// </summary>
/// <remarks>Returns whether or not the predicate conditions exists at least one time.</remarks>
public static bool Exists<TSource>(this IQueryable<TSource> source, Expression<Func<TSource, bool>> predicate)
{
return source.Where(predicate).Any();
}
}
Try setting the BatchSize property to 1000. This will batch up the insert in 1000-record batches rather than the whole lot. You can tweak this value to find what is optimal. I have used SqlBulkCopy for similar-sized data and it works well.
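A sketch, reusing the names from the question (1000 is just a starting point to tune):

SqlBulkCopy sqlbulkCopy = new SqlBulkCopy(ServerConfiguration);
sqlbulkCopy.DestinationTableName = "[dbo].[buy.com]";
// Send rows to the server in batches of 1000 instead of one huge batch.
sqlbulkCopy.BatchSize = 1000;
sqlbulkCopy.BulkCopyTimeout = 600;
sqlbulkCopy.WriteToServer(dt);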
Faced with the same issue, I found that the OutOfMemoryException came from the maximum number of rows a DataTable can hold. I solved it by recreating the table after at most 500,000 rows. Hope my solution will be helpful:
var myTable = new System.Data.DataTable();
myTable.Columns.Add("Guid", typeof(Guid));
myTable.Columns.Add("Name", typeof(string));
int counter = 0;
foreach (var row in rows)
{
++counter;
if (counter < 500000)
{
myTable.Rows.Add(
new object[]
{
row.Value.Guid,
row.Value.Name
});
}
else
{
using (var dbConnection = new SqlConnection("Source=localhost;..."))
{
dbConnection.Open();
using (var s = new SqlBulkCopy(dbConnection))
{
s.DestinationTableName = "MyTable";
foreach (var column in myTable.Columns)
s.ColumnMappings.Add(column.ToString(), column.ToString());
try
{
s.WriteToServer(myTable);
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
s.Close();
}
}
}
myTable = new System.Data.DataTable();
myTable.Columns.Add("Guid", typeof(Guid));
myTable.Columns.Add("Name", typeof(string));
myTable.Rows.Add(
new object[]
{
row.Value.Guid,
row.Value.Name
});
counter = 0;
}
}
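One caveat with this approach: any rows accumulated after the last 500,000-row flush are never written, so a final flush after the loop is needed, along these lines:

// Flush whatever remains after the loop.
if (myTable.Rows.Count > 0)
{
    using (var dbConnection = new SqlConnection("Source=localhost;..."))
    {
        dbConnection.Open();
        using (var s = new SqlBulkCopy(dbConnection))
        {
            s.DestinationTableName = "MyTable";
            s.WriteToServer(myTable);
        }
    }
}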