
I have a class like below,

class Student
{
    public string Name {get; set;}
    public string Surname {get; set;}
    public int Age {get; set;}
    public string Address {get; set;}
}

And I have a MySQL table with 50,000 records. The structure of the table is like below:

ID        NAME       SURNAME        AGE          ADDRESS
1         Joe        Philip         20           Moscow
2         Misha      Johny          25           London
...

And I have this C# code:

List<Student> students = new List<Student>();
string sql = "SELECT name,surname,age,address FROM Students";
command.CommandText = sql;
MySqlDataReader reader = command.ExecuteReader();
while(reader.Read())
{
    Student st = new Student();
    st.Name = reader["Name"].ToString();
    st.Surname = reader["Surname"].ToString();
    st.Age = Convert.ToInt32(reader["Age"].ToString());
    st.Address = reader["Address"].ToString();
    students.Add(st);
}

But it runs very slowly. What would you advise to make this code run faster?

UPDATE:

When I use this code,

DataTable dt = new DataTable();
adapter.Fill(dt);

It works very well and the speed is normal. But why is it slow when I use my own classes?

  • Of course it is slow. Don't read 50k records into memory all at once! Commented Nov 11, 2011 at 17:55
  • 3
    This is as fast as it will probably get, there's no obvious enhancement or oversight here - do you need all of these records in memory? Commented Nov 11, 2011 at 17:55
  • Out of curiosity, why do you need to read the whole database table into the memory at all? It's the general idea of databases that they can potentially keep much more data than can fit into memory, and that we ought to process the data record by record. Commented Nov 11, 2011 at 17:57
  • See my answer; you'll probably get better answers if you tell us what you're going to do with this student data. Commented Nov 11, 2011 at 17:57
  • 2
    That code does not compile. You are assigning a string to an int property (age). Commented Nov 11, 2011 at 17:58

2 Answers


If the code runs slowly, the largest cause is that there are 50,000 records. What exactly do you need with 50,000 Student objects? If you can find a way to solve your problem without reading all of those records and creating all of those objects, you'll have faster code.

Update

Using your own class is fine. Most of the time when things run slowly, it's because your code is I/O bound (you spend most of your time waiting for I/O). To avoid all that I/O, you can reduce the amount of data you retrieve (perhaps by eliminating irrelevant columns or rows) or do your processing on the database side through a more complex query or stored procedure.
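As a rough sketch of the "retrieve less" advice: the trimmed column list, the class_id column, and the @classId parameter below are assumptions for illustration only, not part of the asker's schema. The point is to let the database do the filtering and to process rows as they stream in.

```csharp
// Sketch only: assumes an already-open MySqlConnection named conn, and a
// hypothetical "class_id" column to filter on. Fewer columns and fewer rows
// mean less I/O for the loop to wait on.
string sql = "SELECT name, surname FROM Students WHERE class_id = @classId";
using (MySqlCommand command = new MySqlCommand(sql, conn))
{
    command.Parameters.AddWithValue("@classId", 5);
    using (MySqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // Process each row as it streams in instead of buffering all 50,000.
            Console.WriteLine(reader.GetString(0) + " " + reader.GetString(1));
        }
    }
}
```

The `using` blocks also ensure the command and reader are disposed, which the original snippet omits.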

Update 2

To answer your follow-up question (why creating a list of objects is slower than filling a DataTable): I would expect reading the entire query into a DataTable to be only slightly faster than object creation, at best. I'm not familiar with how the MySQL .NET connector is implemented, so it's surprising that the two methods would differ so much in speed. Maybe MySqlDataReader is doing something wasteful internally. If the performance really is drastically different between the two, it's probably something the author of that library should fix.

Update 3

The accepted answer to "MySqlDataAdapter or MySqlDataReader for bulk transfer?" has a good tip: setting the batch size may help. If the batch size is too small, the reader becomes less efficient with a large record count like yours.


3 Comments

I'm writing an exam-checking program. I will use this data, plus other data I haven't shown here, to determine students' scores. I could use a DataSet here instead of the Student class, but I prefer to use my own class. There isn't any problem with RAM; only filling the List with data is slow.
Without knowing the specifics of what kind of processing you're doing on the data, we can't give much more help. Faster code is code that does less, so try to find a way to make your code do less (for example, by offloading some processing to the database). See my update.

Using the column's ordinal index instead of its name makes the performance a bit better. Use

st.Name = reader[0].ToString();
instead of
st.Name = reader["Name"].ToString();

and
st.Surname = reader[1].ToString();
instead of
st.Surname = reader["Surname"].ToString();
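The ordinal trick above can be made less brittle by looking the ordinals up once with GetOrdinal, and the ToString/Convert round-trips can be skipped with the typed accessors. A sketch, assuming reader is an open MySqlDataReader over the asker's query:

```csharp
// Sketch: assumes "reader" is an open MySqlDataReader over
// "SELECT name,surname,age,address FROM Students".
// Look up each ordinal once, outside the loop, then use typed accessors
// so there is no per-row name lookup and no string round-trip for Age.
int nameOrd = reader.GetOrdinal("name");
int surnameOrd = reader.GetOrdinal("surname");
int ageOrd = reader.GetOrdinal("age");
int addressOrd = reader.GetOrdinal("address");

List<Student> students = new List<Student>();
while (reader.Read())
{
    students.Add(new Student
    {
        Name = reader.GetString(nameOrd),
        Surname = reader.GetString(surnameOrd),
        Age = reader.GetInt32(ageOrd),      // avoids Convert.ToInt32(... .ToString())
        Address = reader.GetString(addressOrd)
    });
}
```

GetOrdinal keeps the code readable if the column order in the SELECT ever changes, while still paying the name-lookup cost only four times instead of four times per row.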

1 Comment

Also note that string indexing must use the proper case, otherwise it will be terribly slow. The MySQL ADO.NET implementation is odd in this respect, and it's only MySQL's: every other connector I tested worked fine.
