New on LowEndTalk? Please Register and read our Community Rules.
All new Registrations are manually reviewed and approved, so a short delay after registration may occur before your account becomes active.
Comments
Oh boy...
Varchar is variable in length. Every time MySQL updates the table, it checks the size, updates the size, updates the content, and rearranges the next varchar's position...
Varchar is great, and I use it too. But not more than one or two per table. And definitely not 200+ per table.
Varchar is also not wise to use for anything larger than 255 in size. You declared them all as 512.
What do you store in those fields? Are they text answers, or simple A-B-C-D-E options?
No. Sometimes the answers are long essay answers. Otherwise I would use char or int.
Then please get some books on relational databases. What you have here is not optimized or even close. This isn't something someone on LET is going to be able to resolve like a potential code issue... This is very different.
Please read what I wrote, the link is important.
It's the practical explanation of what Harzem is speaking of.
Not sure why I thought Harzem said this.
Yeah, thanks for the link. I will definitely read it.
@MCHPhil's "normalization" idea is the best approach here. You should learn about how to divide your databases.
If you want a quicker fix, you can implement my idea of storing the answers in a single column. You can even keep them as an array, serialize before saving, unserialize after reading. This will get you a performance boost, but the records will not be searchable or indexable. In the future, if you decide to implement queries like "who answered 10th question correctly?", you will wish you had normalized your database structure.
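That single-column idea can be sketched as follows. This is a minimal Python/SQLite stand-in for the PHP serialize()/unserialize() flow (table and column names are made up), and it shows the trade-off: the round trip works, but the blob is opaque to SQL.

```python
import json
import sqlite3

# Self-contained demo: SQLite in memory stands in for MySQL,
# json.dumps/loads stand in for PHP serialize()/unserialize().
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE results (student_id INTEGER PRIMARY KEY, answers TEXT)")

answers = {"q1": "B", "q2": "D", "q10": "a long essay answer..."}

# Save: serialize the whole answer array into one column.
con.execute("INSERT INTO results VALUES (?, ?)", (42, json.dumps(answers)))

# Read: fetch the string back and unserialize. Note that SQL cannot answer
# "who answered q10 correctly?" here -- the column is just an opaque string.
row = con.execute("SELECT answers FROM results WHERE student_id = 42").fetchone()
restored = json.loads(row[0])
```

The performance win comes from doing one read and one write per student instead of touching 200 columns; the cost is losing searchability and indexing, exactly as described above.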
Additionally, if you are storing user-entered text, make sure you properly implement input validation. Otherwise people will hack your database on the first day.
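On the input-validation point: at minimum, user-entered answers should go through parameterized queries rather than string concatenation. A minimal sketch (SQLite stand-in, hypothetical table names):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE answers (student_id INTEGER, question INTEGER, answer TEXT)")

# Malicious-looking input is stored as inert data, not executed as SQL,
# because the driver binds it as a parameter instead of splicing it into
# the query string.
evil = "x'); DROP TABLE answers; --"
con.execute(
    "INSERT INTO answers (student_id, question, answer) VALUES (?, ?, ?)",
    (1, 10, evil),
)

stored = con.execute(
    "SELECT answer FROM answers WHERE student_id = ? AND question = ?", (1, 10)
).fetchone()[0]
row_count = con.execute("SELECT COUNT(*) FROM answers").fetchone()[0]
```

In PHP the equivalent would be PDO prepared statements; the principle is the same in any driver.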
Also, don't make your "starttime" a varchar. Either use INT and store Unix timestamps, or use DATETIME and let MySQL handle the date conversion properly.
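Why this matters: varchar dates compare lexically, while INT Unix timestamps (or DATETIME) compare chronologically. A quick illustration with made-up values:

```python
# The same two moments, roughly "Sep 1" and "Oct 1" of one year, stored two ways.
as_varchar = ["9/1/2017 10:00", "10/1/2017 09:00"]   # October is the later one...
as_unixtime = [1504260000, 1506848400]               # ...and the integers agree.

# String comparison looks at characters: "9" > "1", so September "wins".
lex_latest = max(as_varchar)

# Numeric comparison is chronological, so October wins as it should.
num_latest = max(as_unixtime)
```

The same wrong ordering happens inside MySQL when you ORDER BY or compare a varchar date column.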
Maybe start with a smaller project. Less intertwined data may help you understand the complexities of the database.
EXPLAIN is a very powerful tool. You can prepend it to any statement to see what happens internally. It will even tell you what kind of joins are happening, tmp tables or in memory etc. Very useful tool.
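In MySQL you simply prepend `EXPLAIN` to the statement. Here is the same idea with SQLite's `EXPLAIN QUERY PLAN` so the sketch is runnable anywhere (schema is hypothetical); it shows the plan changing from a full scan to an index search once an index exists:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE answers (student_id INTEGER, question INTEGER, answer TEXT)")

# Without an index, the planner reports a full table scan for this lookup.
before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM answers WHERE student_id = 7"
).fetchone()[3]

# After adding an index, the same query becomes an index search.
con.execute("CREATE INDEX idx_student ON answers(student_id)")
after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM answers WHERE student_id = 7"
).fetchone()[3]
```

MySQL's output has more detail (join type, key used, estimated rows, temp tables), but the workflow is identical: run EXPLAIN, look for full scans, add or fix indexes.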
https://www.amazon.com/MySQL-Pocket-Reference-Statements-Functions/dp/0596514263/ref=sr_1_4?ie=UTF8&qid=1512920187&sr=8-4&keywords=oreilly+mysql
I have an older version of this book and it was what I used when I was still learning the intricate side of things. YMMV.
There are also a lot of "test/quiz" taking tools out there. It may be beneficial to use something mainstream while building your own. Honestly, this isn't a hello world, as I'm sure you know.
Or as @harzem said, throw it all in 1 column and serialize. Will be tons faster to manage everything in PHP than in SQL.
You are in way over your head.
LOL I've had some good luck with AWS Aurora handling bad databases relatively well. Just toss it over there and call it a cloud app. You cloud engineer!
There is none that does what I want. The original purpose of this app was to be used offline on a LAN, so I didn't put much thought into optimizing it, as long as it could handle 40 students. It's only recently (3 years later) that the idea came up to put it all online with very limited resources.
Just serialize the array of answers and put it in one text field in MySQL and be done with it then.
Couldn't agree more.
I have a feeling this is going to go down hard. Hopefully I am wrong this time.
When you update, if you are updating one answer at a time, it's gonna be slower than grandma crossing the highway.
What you can do instead is remove the 200 columns and add a new INT column called "questionNumber" which will hold the question/answer number, one column called "no", and one column called "score".
That way, you will remove 197 columns and add 99 rows per user. It should be way faster if you update/insert one question at a time. Plus, unlike JSON/serialized data, you will be able to search by answer, sort by score, etc...
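A sketch of that normalized layout (SQLite used for a self-contained demo; the exact column set is an assumption based on the suggestion above):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE answers (
        student_id     INTEGER NOT NULL,
        questionNumber INTEGER NOT NULL,
        answer         TEXT,
        score          INTEGER,
        PRIMARY KEY (student_id, questionNumber)
    )
""")

# One row per (student, question) instead of 200 columns per student.
rows = [(1, 10, "B", 1), (2, 10, "C", 0), (1, 11, "A", 1)]
con.executemany("INSERT INTO answers VALUES (?, ?, ?, ?)", rows)

# Unlike a serialized blob, this is searchable: who got question 10 right?
correct = [r[0] for r in con.execute(
    "SELECT student_id FROM answers WHERE questionNumber = 10 AND score = 1"
)]
```

Updating one answer now touches exactly one short row, and queries like "who answered the 10th question correctly?" become a plain indexed WHERE clause.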
Another approach would be to store it as JSON during peak time and rebuild it all at night in your current table. It might not be as efficient, but it's a way.
Also, make id UNSIGNED; it won't ever have a negative value. Timestamps and dates should be declared DATETIME, or UNSIGNED BIGINT if you want to store a Unix timestamp. Either way, it's better than varchar.
FYI, there is a MEMORY storage engine which is included by default.
https://dev.mysql.com/doc/refman/5.7/en/memory-storage-engine.html
You could copy your tables to MEMORY tables and then do whatever analysis you want in memory.
Use a ram disk:
https://www.kiloroot.com/create-a-ram-disk-in-linux-tell-mysql-to-put-stuff-there/
Write your code with a buffer in mind and it won't be risky, unless you're worried about the temp files possibly being cleared out.
Increment of number of rows is not a problem. A VPS with 2GB of memory can easily handle 1 million short rows. Try normalizing your table so that each row has a student ID and a question ID with the answer. Should make you much happier.
Since this current design is the rewrite, I seriously question how bad the last design was. Jebus.
I think it'd be better to use the MEMORY storage engine. What you're doing there is essentially tricking MySQL into thinking it's on a disk. I'm sure it'll work fine, but perhaps there are optimizations that MySQL invokes if it knows it's working with RAM storage, or perhaps it does stats or diagnostics differently, etc.
OTOH...I don't have a link handy, but if the OP really wants to fly, on Linux it's possible with some video cards to use their graphic memory as swap or a ramdisk.
I haven't touched the database design. I only changed the PHP code, and going from 100% CPU usage to 10% is a pretty good change, I think.
The DB is pretty damn bad, but people are all but telling you it's wrong.
This should use very very very very very very very very, I can't say very enough, very minimal cpu usage. I do suggest just biting the bullet and using one of the bajillion ready made scripts that would do the same thing. You can host them on a LAN. Not sure why that would disqualify anything.
Don't you think it will take some time to find the record to update among 800,000 rows of answers, assuming there are 8,000 students and each student has 100 rows of answers?
tweak your query
My database design is a mess. The query is fine.
..and for god's sake, use several columns and join on those.
A users table with an INT user id, with that user id referenced by the answers table, et al.
OK, here is what I'm gonna do:
Instead of using 100+ columns for the answers, I am going to put all the answers in one column as a serialized string. When a student sends their answer, I will get that string from the database, insert the answer into the string, and update it in the database.
I think I will go for VARCHAR(10000). Is that better?
Jesus fuck why the hell are you even using a database at this point?
Welcome to indexing.
You're in over your head.
No.