How to Export SQL Developer Query Results Without Re-Running the Query

thatjeffsmith SQL Developer 57 Comments


You run a query. You wait patiently for your query and results to return. You want the results in Excel. You ask SQL Developer to ‘Export.’ SQL Developer seems to start over by running the query again. You are no longer waiting patiently.

No, no, no!

How can you avoid this?

The solution is to ensure all of the data has been fetched into the grid before you ask for the Export.

What the heck is he talking about, fetched?

Running a query in Oracle has several distinct steps.

1. You submit your query.
2. The database engine parses the query and identifies an execution plan. If it doesn’t already have one cached, it builds a new one. Ideally this takes no more than a millisecond or two.
3. The database runs your query. It may spawn one or more processes to get the data and do all the sorts, merges, etc. You don’t really care how it gets your data, but it’s all there to see in the plan.
4. The database tells the requestor that their query has executed.
5. You ask for the data.

Asking for the data is usually an assumed part of running the actual query. What good is running a query without seeing the results? However, this last phase is what is known as a ‘fetch.’ Records are retrieved in batches, and the size of each batch depends on what you have ‘SQL Array Fetch Size’ set to. The default is ‘100.’
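
For example, with the default of 100, a query that ultimately returns 5,000 rows takes 50 fetch round trips before everything is in the grid.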

Fetch! SQL Developer ALWAYS comes back with the stick 🙂

This means that SQL Developer will by default return the first 100 rows of your query. You can read more of the JDBC nitty-gritty here if you’re morbidly curious.

We consider this a low-level preference that shouldn’t be tweaked unless explicitly advised to do so by Oracle Support.

What’s really most important to understand here is that many of your queries return MORE data than what is shown in the initial display of the results grid.

The first fetch is often sufficient to answer most questions, but...

So how do you get the rest of the records?

Click in the grid so it has focus.

Ctrl+End

This will force the rest of the records back into the grid.

And now you will see something like this:

Fetch time can often exceed execution time!

Now with all records returned, Export will use the data in the grid instead of re-querying the database.

Clear as mud?

But WHY does SQL Developer force a second query execution?

Good question.

If you do force all of the rows back into the grid, you are going to consume a decent amount of memory. Some query result sets are larger than others, and not all of us have 64-bit monsters to run our tools on. So, in order to conserve machine resources and memory, we just run the query again and write the data directly to the destination, bypassing the data grid.
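
By the way, if the file is all you are really after, you can skip the grid entirely: run the statement as a script (F5) with an output hint, as several folks work through in the comments below. A minimal sketch, with a placeholder path and table:

spool c:\data.csv
select /*csv*/ * from your_table;
spool off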


Comments 57

  1. Hello Jeff
Is there a way to sort data in the grid without re-running the script? Even when all the data is fetched, when I ask SQL Developer to sort the data for me, it runs the script again, which takes time.
    Thanks

    1. thatjeffsmith Post
      Author

      Mmmmm, asking for a sort does a database level sort, i.e. ORDER BY…which of course means you need to run the query again.
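
      For example, a header-click sort on a hypothetical ENAME column effectively re-submits the query as something like:

      select * from emp order by ename;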

      If that’s too painful, maybe copy your data to Excel?

  2. Hi Jeff,

    This is a really helpful tip. I would love to do this on my Mac but I can’t find a keyboard shortcut to go to the last record in the grid. Is there one?

    1. thatjeffsmith Post
      Author
  3. Hi Jeff,

    do you know whether it is possible to export to csv using a semicolon as a delimiter when spooling the export?

    1. thatjeffsmith Post
      Author
        1. alter session set current_schema = abc;

          set sqlformat delimited ;

          spool l:\voorbeeldcsv.txt
          select /*csv*/ *
          from table;
          --spool off

          This is a simple form of the query I’m using.

          And the output:

          session SET altered.
          line 3: SQLPLUS Command Skipped: set sqlformat delimited

          1. thatjeffsmith Post
            Author
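
            That set sqlformat command is honored by SQLcl and by later versions of SQL Developer; in older worksheets it is skipped, which is exactly what the ‘SQLPLUS Command Skipped’ line in your output is telling you. A minimal sketch of the intended script, assuming set sqlformat delimited accepts the delimiter as an argument there (the path and table are placeholders):

            set sqlformat delimited ;
            spool l:\voorbeeldcsv.txt
            select * from your_table;
            spool off
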
        2. thatjeffsmith Post
          Author
  4. That Jeff! 🙂 You helped even a novice like me! Jeff, I appreciate your posting this valuable information. It is easy to understand and I got the results I needed! Thanks, Carol

  5. Hello sir,

    I’m having a query result that has around 5 million rows and it takes forever to export it to csv (I’m using SQL Developer).

    Is there anything I can do to optimize it?

    1. thatjeffsmith Post
      Author
    2. Hello sir,

      Thank you for your instruction. I followed it, but my SQL Developer stopped writing data when it reached a certain amount. The progress bar keeps running but no data is exported.

      If I try to use the Export wizard, it freezes once 2400 rows are exported, with the same symptom: the progress bar keeps running but no more data is exported.

      Please kindly review my case.

      Thank you

      1. thatjeffsmith Post
        Author
        1. Hello sir,

          I’m trying to export to a csv file, and the data size has been reduced to around one hundred thousand rows.

          By the way, I’ve discovered a method to export my data using RStudio, so this is not a big problem for me anymore. But yes, if SQL Developer could do it itself, that would be more convenient.

          Thank you

          1. thatjeffsmith Post
            Author

            100k rows to csv shouldn’t present a problem to SQLDev.

            You can try this as a workaround.

            In a worksheet:
            spool c:\your_file.csv
            select /*csv*/ * from your_table where…;
            spool off

            execute via F5, minimize the script output panel

  6. How can I extract output data in a user-defined format in Excel without using any other language except SQL or PL/SQL queries?

    1. thatjeffsmith Post
      Author
    1. thatjeffsmith Post
      Author
  7. Awesome!! I can’t wait for my current export to finish, c. 2047-2048? so I can try this out. Thank you so much for this article!

    1. thatjeffsmith Post
      Author
      1. It was a compliment, not sarcasm. My exports have been taking forever. Now I know both why that is and how to fix it. Cheers!

  8. hi Smith,

    is there any way I can export the DDL statements of a table to a text file programmatically (using Java)?

    for example, I have a table X with 2 columns A and B, and I want to export the insert statements to a file,
    i.e.
    insert into X (A,B) values (value1,value2);

    Thanks for the above post, it really helped me .. 🙂

    1. thatjeffsmith Post
      Author
      1. hi smith,

        I have tried your suggestion, and I was able to get the create statement for a table.

        SELECT DBMS_METADATA.GET_DDL('TABLE','EMP','SCOTT') FROM DUAL;

        but I didn’t find a way to get the insert statements for a table.

        actually I’m trying to make a small data movement app which will reduce the manual effort of moving data to the test environment.

        I tried hard-coding it using StringBuffer and other stuff; in Java it works, but it’s not efficient and needs a lot of code, since the database has a lot of tables.

        If you know better online material for my question, please share the link.

        thank you 🙂 .

        1. thatjeffsmith Post
          Author
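
          One thing to try, as a sketch: the script engine also understands an /*insert*/ hint, just like the /*csv*/ one, which writes each row back out as an INSERT statement. SCOTT.EMP and the spool path here are just examples:

          spool c:\emp_inserts.sql
          select /*insert*/ * from scott.emp;
          spool off
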
          1. Hi,

            Because if you want to move a small amount of data to the test environment, you have to move data from many related tables; since the application is large and the number of tables is high, it’s very difficult to move the data manually for each table.

            Sometimes the test environment may not have some master table data, so such things need to be handled programmatically.

          2. thatjeffsmith Post
            Author
  9. But if you run a query which returns lots of rows, and the aim is to save the results to a file, why can’t we just define the Export options up front and run the query once? Instead we first have to fetch the results into the grid, wait 20 minutes for the query to end so the Export button becomes available, and then wait another 20 minutes while the export re-runs the query. Hello ?!

    1. thatjeffsmith Post
      Author

      Populating the data grids with the data is expensive. When fetching the records down on the 2nd execution, we write them directly to the file/buffer/clipboard w/o populating the grids.

      1. hi jeff, yes indeed it’s expensive, and that was the point of my rant: why not completely bypass the 1st execution into the grid?

        1. thatjeffsmith Post
          Author

          Maybe we should start over, but we put data into the grid because people want to see the data when they run their queries.

          If you want to just send your data straight to a file, run it as a script and do something like this…

          spool c:\data.csv
          select /*csv*/ col1, col2, … coln from table where…;
          spool off

          And minimize the script output panel.

          That will do everything you want in one step.

          1. Hello,
            I’m trying to download my table from the server to the local disk with the script below, but no output file is created. What could cause the problem? Thanks in advance

            spool C:\erasmus.txt;
            select /*csv*/ * from Erasmus;
            spool off;

          2. thatjeffsmith Post
            Author
          3. Thanks for the information.
            I used the same command, but I get the following message.

            Cannot create SPOOL file c:\data.csv

            I am using Oracle SQL developer Version 3.2.20.09

          4. thatjeffsmith Post
            Author
  10. Hi Jeff,

    Thanks a lot for your help. Still, I want to ask about the logical flow of an export-to-Excel feature, i.e. how the process should work: first it executes a query, then it adds the data, column by column, into the Excel file. Could you please provide an abstract logical flow? It will help me understand the functionality and implement it myself if required.

    Thanks.

    1. thatjeffsmith Post
      Author
  11. Hi Jeff

    I have to implement export-to-Excel functionality in an Oracle procedure. I am running a query inside a procedure and want to put the data from that cursor query into an Excel file; the generated file will be placed on a file server.

    I am not able to implement this functionality. Could you please guide me?

    Thanks a lot.

    1. thatjeffsmith Post
      Author
  12. Jeff: I’m trying to create a job and schedule that will export data from a table on a nightly basis. I’m not quite sure how to do this. I’ve got the SQL to create the table, but I don’t know how to create the job or program to automate it. Could you show me the steps to do this in SQL Developer please?

    1. thatjeffsmith Post
      Author

      I could direct you to the wizards: just right-click on the Scheduler node in your database tree and use the New Job wizard.

      Otherwise, I would advise you to read the DBMS_SCHEDULER docs.

      What you want to do is possible. I’m betting Tim over at ORACLE-BASE already has an example for you to play with.
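
      For the curious, a minimal DBMS_SCHEDULER sketch; the job name is a placeholder, and it assumes you already have a stored procedure (here called your_export_proc) that writes the table out, e.g. via UTL_FILE:

      begin
        dbms_scheduler.create_job (
          job_name        => 'NIGHTLY_TABLE_EXPORT',         -- placeholder job name
          job_type        => 'PLSQL_BLOCK',
          job_action      => 'begin your_export_proc; end;', -- assumed export procedure
          start_date      => systimestamp,
          repeat_interval => 'FREQ=DAILY;BYHOUR=1',          -- every night at 1 AM
          enabled         => true);
      end;
      /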

  13. You directed me back here 🙂 – on purpose ?

    If so, then both of the above scenarios (Query + Export, and Query + Ctrl+End + Export) execute the query twice, and force me to monitor the query in order to start the extract 🙂

    1. thatjeffsmith Post
      Author
      1. Thanx Jeff,

        I am using: Version 4.0.1.14 – Build MAIN-14.48

        I just traced both scenarios, and despite my perception that it executed twice, the latter suggestion (Ctrl+End) actually only executes once (it just takes a long time buffering the data (350k rows) in memory 🙂

        For a scenario like mine, where the query takes close to two hours, it would be nice if it weren’t a three-step process to get from query to exported Excel file (Query, Ctrl+End, Export). E.g. make Ctrl+Shift+Enter pop the export dialog and voila, more happy (heavy) users 🙂

        Thanx for your time and an invaluable product!

  14. I know this is an old question, but I have a huge query and want it in Excel. It takes 6000+ seconds to run (optimized :)) and returns 350k rows.

    I would love to avoid having to run the query once to see 50 rows, only then getting the option to export to Excel, and then having to run the full query again.

    Is there a way?

    1. thatjeffsmith Post
      Author
  15. Many thanks. What you described works really well for me. I wish Oracle would add these tips to the help section. On the OTN forum, it was suggested to add AddVMOption -Xmx1024M to the conf file, but SQL Developer will not start with that param.

    Thanks again. This was a great help.

    1. JeffS Post
      Author

      I’m guessing SQL Developer wouldn’t start with that conf item because you don’t have enough free memory. Are you trying to export to Excel? If so, try exporting to CSV instead and let Excel convert the file on opening it. No extra memory necessary (probably!) and it should run much faster.


Leave a Reply

Your email address will not be published. Required fields are marked *