$32K \times 1$ RAM
Is specifying no unit a new standard? Or is "$\times 1$" a new notation for "bits"? Is there any link or wiki explaining this notation?
To me, "$32K \times 1$ RAM" looks like a printing mistake. How is one supposed to decode it straight away as meaning 32 Kbit without having any doubt?