Very little guidance in my current project. It feels like I have been dropped into an ocean while I was still getting comfortable in a swimming pool. I have a very large data set, approximately 9x10^8 entries, and since each entry is a 64-bit double it takes up about 7.2 GB, which is impossible to fit into memory alongside all the other variables I want to store. I tried using float instead, but that gave me strange results (presumably from the loss of precision).
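For reference, a minimal sketch of one way around the memory limit, assuming the data can be dumped to a raw binary file and worked on in Python/NumPy (neither of which is a given for this project; the file name and shape below are placeholders):

```python
import numpy as np

# Placeholder size and file; assumes the ~9e8 doubles were dumped to a raw binary file on disk.
N = 900_000_000

# float64 is 8 bytes per entry, so the full array is roughly 7.2 GB.
print(N * np.dtype(np.float64).itemsize / 1e9, "GB")

# Memory-map the file: the OS pages in only the regions that are actually accessed,
# so the whole 7.2 GB never has to sit in RAM at once.
data = np.memmap("data.bin", dtype=np.float64, mode="r", shape=(N,))

# Work on the array in chunks; each slice reads just that region from disk.
chunk = data[:1_000_000]
print(chunk.mean())
```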
Another concern is that the data is very sparse. In a matrix of size 9 million x 3 million there are only 14 million non-zero entries, and most rows have just one or two non-zero values; on average each row has fewer than two. Need to think.
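That level of sparsity points at a sparse storage format: a compressed sparse row (CSR) layout keeps only the non-zero values plus their indices, so memory scales with the 14 million non-zeros (a few hundred megabytes) rather than the 2.7x10^13 cells of the dense matrix, which would need on the order of 200 TB as doubles. A tiny sketch, assuming Python with SciPy (again, just an assumption); the toy matrix below only illustrates the format:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Toy example of the CSR idea: store only the non-zero entries.
# The real matrix would be ~9e6 x 3e6 with ~1.4e7 non-zeros, but the format is the same.
rows = np.array([0, 0, 1, 3])
cols = np.array([2, 5, 1, 4])
vals = np.array([1.5, 2.0, -0.5, 3.25])

A = csr_matrix((vals, (rows, cols)), shape=(4, 6))

# CSR keeps three arrays (values, column indices, row pointers),
# so memory is O(nnz + n_rows) instead of O(n_rows * n_cols).
print(A.nnz, A.shape)
print(A @ np.ones(6))  # a matrix-vector product touches only the non-zeros
```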
Project Ideas: Old and New
1. An application to aggregate donations for humanitarian causes
2. Matter: a crowd-sourced news platform
3.