Support Forums

Full Version: how to open a 750 MB database .sql file?
Just load it in Notepad, and go get a coffee and a few cigarettes. It should be open after a few minutes.
(08-21-2011, 01:09 AM)WhosYourDaddy Wrote: [ -> ]Just load it in Notepad, and go get a coffee and a few cigarettes. It should be open after a few minutes.

A 750 MB file would be way too much for Notepad to handle. Even if you did get it loaded without crashing Notepad, adding to or modifying it would involve long waits as well, assuming it didn't freeze the moment you tried to modify it.

Even Notepad++ doesn't support a file that large.
He didn't specify what he wants to do with it.
But as you said, of course Notepad won't be good for this operation, so the best option would be to import it into a reliable MySQL server.
(08-21-2011, 01:41 AM)WhosYourDaddy Wrote: [ -> ]He didn't specify what he wants to do with it.
But as you said, of course Notepad won't be good for this operation, so the best option would be to import it into a reliable MySQL server.

I'd assume he wants to open it in a text view, though, since he's trying text editors, but if you import its data into a MySQL database, you'd be able to read the information from it pretty easily with phpMyAdmin.

However, a solution for a text view would be to open its text contents in a web browser: Open with... > Firefox or some other web browser. It will take a bit to load all the information, but it loads the information in chunks, giving you a better chance of seeing all of it. I would recommend closing all other open windows, though, as it could be a little resource intensive.
Yeah, that's what I'm suggesting: find a way to import it into a MySQL server, so he will be able to check its contents via phpMyAdmin easily without any trouble.
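If he does go that route, uploading a dump that size through phpMyAdmin's import form will probably time out, so the command-line client is usually the safer bet. Just a rough sketch from PowerShell, assuming MySQL is installed locally; the install path, user, and database name below are only placeholders for whatever his setup actually is:

Code:
# -- Rough, untested sketch; the mysql.exe path, user and database name are placeholders
# Create an empty database for the dump (it will prompt for the password)
& "C:\Program Files\MySQL\MySQL Server 5.5\bin\mysql.exe" -u root -p -e "CREATE DATABASE dumpdb"

# Let the mysql client read the 750 MB file itself with 'source', instead of
# piping it through PowerShell, which would pull the whole thing into memory
& "C:\Program Files\MySQL\MySQL Server 5.5\bin\mysql.exe" -u root -p dumpdb -e "source C:/path/to/the_dump.sql"

After that, the tables should show up in phpMyAdmin as usual.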
That can be quite a bit of work if he doesn't have a MySQL database to dump the info into, though, which is the only issue with doing it that way; but seeing as the options are limited with text editors, there aren't too many options left over.
(08-21-2011, 02:56 AM)Ace Wrote: [ -> ]That can be quite a bit of work if he doesn't have a MySQL database to dump the info into, though, which is the only issue with doing it that way; but seeing as the options are limited with text editors, there aren't too many options left over.

It won't be easy to import it or to open it with any text editor, but if he wants to check it out, he will need to take some action.
(08-21-2011, 03:13 AM)WhosYourDaddy Wrote: [ -> ]It won't be easy to import it or to open it with any text editor, but if he wants to check it out, he will need to take some action.

Notepad++ won't even actually load a file that large, though; it will reject it entirely with a message box telling you the file is too large. So I personally don't think you can do anything with text editors, unless you try something like UltraEdit, which is another advanced editor I've used in the past, but it's not free, and I didn't find it to be as good as Notepad++.
I set up a PowerShell script for you to test out; this should get the contents after a little while of waiting.

Code:
# -- Script Snippet Created By Ace
# Read the whole file into memory, print it, then release the variable
cls; $_Output = [System.IO.File]::ReadAllLines("PLACE FULL FILEPATH HERE")
Write-Host $_Output -ForegroundColor "green"
Clear-Variable _Output

Make sure to replace "PLACE FULL FILEPATH HERE" with the filepath of the file you want the contents of.

Get-Content was throwing OutOfMemory exceptions for me with a file this large, so I changed it to use the ReadAllLines method from the System.IO namespace in .NET, because I believe it's faster and more efficient. I couldn't just pipeline the information through either, because that seemed to throw exceptions as well.
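If ReadAllLines itself ever runs out of memory on a file that big, there's also a lazier way to read it that I haven't tried on anything this large; just a sketch, and it assumes .NET 4 is installed, since ReadLines doesn't exist in older framework versions:

Code:
# -- Sketch only: ReadLines (needs .NET 4+) hands the lines back one at a time
# instead of building the whole array in memory first
foreach ($Ln in [System.IO.File]::ReadLines("PLACE FULL FILEPATH HERE")) {
    Write-Host $Ln -ForegroundColor "green"
}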

If I develop this more, I might be able to read and output the first 10 000 bytes to a txt file before clearing the variable and continuing with reading the base input file. That way you'd end up with the original file split into numerous txt files, each a separate portion of the original combined file, so you don't crash an editor like Notepad when you read the content in smaller pieces.
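Just to give a rough idea of what that splitting could look like; this is only an untested sketch, and the chunk size and output paths are placeholders:

Code:
# -- Untested sketch of the splitting idea; chunk size and paths are placeholders
$InputPath = "PLACE FULL FILEPATH HERE"
$ChunkLines = 10000   # lines per output file
$Part = 1
$Buffer = New-Object System.Collections.Generic.List[string]

foreach ($Ln in [System.IO.File]::ReadAllLines($InputPath)) {
    $Buffer.Add($Ln)
    if ($Buffer.Count -ge $ChunkLines) {
        # Flush the current chunk to its own numbered txt file
        [System.IO.File]::WriteAllLines("C:\OUTPUT_$Part.txt", $Buffer.ToArray())
        $Buffer.Clear()
        $Part += 1
    }
}
# Write whatever is left over after the last full chunk
if ($Buffer.Count -gt 0) {
    [System.IO.File]::WriteAllLines("C:\OUTPUT_$Part.txt", $Buffer.ToArray())
}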



Update:
Code:
# -- Script Snippet Created By Ace
# Testfile
# Test run against the hosts file: read it all in, then print it line by line

cls; $_Output = [System.IO.File]::ReadAllLines("C:\Windows\System32\Drivers\ETC\HOSTS")
foreach ($Ln in $_Output) {
    Write-Host $Ln -ForegroundColor "magenta"
}
Clear-Variable _Output



Update 2:
Code:
# -- Script Snippet Created By Ace
# Testfile 2

$InputPath = "FILEPATH TO THE FILE YOU WANT TO READ"
$OutputPath = "C:\OUTPUT_.txt"

cls; $_Output = [System.IO.File]::ReadAllLines($InputPath); $i = 1
foreach ($Ln in $_Output) {
    # Stop once roughly the first 1000 lines have been handled
    if ($i -gt 1000) { break }
    # Echo the line to the console and append it to the output file
    Write-Host $Ln -ForegroundColor "magenta"
    $Ln >> $OutputPath
    $i += 1
}
Clear-Variable _Output

Something like this will take approximately the first 1000 lines from the file you read and output them to a text file at C:\OUTPUT_.txt. Just edit the $InputPath and $OutputPath variables to the file paths you want to read from and write to.