This smells very strongly of B.S.
"Data can be ingested from a local memory channel or any wire such as USB, Direct Wire, TCP/IP Packets, Fibre Channel, InfiniBand, iSCSI, etc."
Huh? I was not aware that "TCP/IP Packets" were a way to ingest anything. USB, Fibre Channel, and InfiniBand are physical interconnects; TCP/IP is a protocol stack that rides on top of them, so listing it alongside actual wires means somebody needs to go back and fill in the blanks on his OSI chart. And what's "Direct Wire"?
" The conversion process consists of primary data (D’ [prime]) that is dismantled into substrate components called “fractals” and processed into SbM (Symbolic Bit Markers). Unlike other technologies an advanced algorithm allows for substrate fluctuation."
This reeks of something a Star Trek scriptwriter would toss together. "Fractals," "substrate fluctuation," "Symbolic Bit Markers"... it's word-salad, but squint past the jargon and what it actually describes (break the data into pieces, swap recurring pieces for shorter symbols, reverse it on read-back) is garden-variety data compression.
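In fact, strip away the vocabulary and here is roughly the scheme that paragraph describes. A toy sketch of my own in Python (nothing below comes from Symbolic IO's patent; the chunk size and the marker table are invented purely for illustration):

# Toy "symbolic marker" codec: this is plain dictionary coding, nothing exotic.
def constitute(data: bytes, chunk: int = 4):
    table = {}      # distinct chunk -> integer marker
    stream = []     # the data, re-expressed as markers
    for i in range(0, len(data), chunk):
        piece = data[i:i + chunk]
        stream.append(table.setdefault(piece, len(table)))
    return table, stream    # store/transmit these instead of the raw data

def reconstitute(table, stream) -> bytes:
    inverse = {marker: piece for piece, marker in table.items()}
    return b"".join(inverse[m] for m in stream)

data = b"ABCDABCDABCDEFGH" * 4
table, stream = constitute(data)
assert reconstitute(table, stream) == data   # "lossless," sure... so is zip

Repetitive input gives you a small table and a short marker stream; feed it high-entropy input and the table grows as fast as the data itself. Swap the integer markers for variable-length codes and you have reinvented ordinary dictionary compression, which is presumably why the pitch works so hard to avoid the word.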
"One of the most compelling elements of Bit Marker technology is that it is lossless and does not require any additional overhead, unlike traditional compression schemes. The output of the conversion process is to store, transmit or both depending on use case. This entire conversion process does not require any delayed post-processing and happens in real-time for the end-user.
Symbolic IO refers to the ingestion conversion process as the “constitution” of data; whereas the data being stored or transported has been converted into this proprietary and unique language format and will remain in that state until a recall or read-back/read-request is received from the system."
Errr... explain to me again, and please use smaller words, how this is different from Real-Time Compression?
Ahh... here it is (well, not really): "Symbolic IO’s patent is based on being a non-compressive algorithm. Compression is a non-deterministic algorithm that requires many CPU cycles and no guaranteed results. By reformatting binary we see consistent results that de-duplication and compression cannot achieve. There is nothing stopping either us or a customer from applying additional compression techniques to further reduce data, if that is the sole primary focus of the end-user, and they were willing to accept the performance penalties associated with standard compression."
BZZZZTTTT!!!! Wrong answer. If you claim you can reduce data size (with "guaranteed" results), then that is, in fact, "compression" in every sense of the word. (And for the record, compression algorithms are perfectly deterministic: the same input always produces the same output. What isn't guaranteed is the ratio.) There IS NO SUCH THING as "guaranteed" compression, and this is not a difficult concept to prove: there are 2^n possible inputs of n bits but fewer than 2^n strings shorter than n bits, so any scheme that shrinks every input must map two different inputs to the same output, at which point lossless recovery is impossible. That is why you cannot ever compress random data. And claiming you can then significantly compress the result further with another algorithm is just icing on the B.S. cake; all but the lousiest compression algorithms produce output that cannot be significantly compressed again. This is classic compression-scam material.
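Don't take my word for it. The pigeonhole principle is demonstrable in a dozen lines with any stock compressor (zlib here, chosen purely as a convenient example):

import os
import zlib

def squeeze(label: str, data: bytes) -> bytes:
    out = zlib.compress(data, 9)
    print(f"{label:20s} {len(data):7d} -> {len(out):7d} bytes")
    return out

text = b"the quick brown fox jumps over the lazy dog. " * 2000
rand = os.urandom(len(text))

packed = squeeze("english text", text)   # shrinks nicely: low entropy
squeeze("random bytes", rand)            # comes back slightly LARGER
squeeze("compressed output", packed)     # barely budges: entropy already spent

# And contrary to the marketing copy, compression is perfectly deterministic:
assert zlib.compress(text, 9) == zlib.compress(text, 9)

Run it: the English text shrinks dramatically, the random bytes come back a few bytes larger, and the already-compressed output barely moves. Any product whose math requires otherwise is a perpetual-motion machine with a SKU.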