Grasshopper

algorithmic modeling for Rhino

Hi,

I'm new to ELK and relatively new to Grasshopper. I am trying to create a map of Paris; however, the OSM file contains too much data, and when I try to create the buildings/streets/transport, it crashes (specifically at the buildings step).

Is there any way to make it more 'operable', i.e. to be able to work with it without risking a crash at every step? (I do know that the slow operation is due to the huge amount of data, but I wonder whether there is any way to work with it at all.)

In terms of location, it's the map of Paris that I'm trying to work with. Basically, I downloaded it via the API and then tried to convert it to an .osm file (I'm still not certain my steps were correct, or that I got the result I was supposed to get).

Unfortunately I can't attach the map file, since it is 400 MB, but here's a link: http://www.openstreetmap.org/export#map=12/48.8539/2.3078

Thank you to any kind soul who can save me from the all-too-familiar "Not Responding".


Replies to This Discussion

Hi Archiheart.

You will have to attach the GH file you are using to load the data, showing the subsequent steps you are trying to take.

Also, tell us how you are importing the data into GH. Have you tried with a small subset of the same data, like just one block? Did it work? Did you use the Profiler to see which components are the most time-consuming? If you can see the total time it takes for data containing, say, 100 values, and the total set has 10,000 values, then you can safely say it will take at least 100 times as long. Have you also checked the Windows Task Manager while it is running, to see if it is actually running out of memory?
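As a quick illustration of that extrapolation, here is a back-of-envelope sketch in Python (the 0.8 s Profiler reading is purely hypothetical; the point is only the linear scaling):

    # Linear extrapolation from a Profiler reading on a small test set.
    t_small = 0.8                      # seconds for 100 values (hypothetical)
    n_small, n_full = 100, 10_000
    estimate = t_small * (n_full / n_small)
    print("expect at least %.0f seconds for the full set" % estimate)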

Until then I don't think we can help you. GH not responding is very common if you work with a lot of data and/or do a lot of operations that take a long time. I don't know why it sometimes freezes completely, sometimes can be paused using Escape, sometimes saves a recovery file and sometimes doesn't. I even have cases where I can't even force-quit Rhino and have to actually restart the computer. Maybe David Rutten could enlighten us about this, as I have always wondered about it, and it's so frustrating having to restart Rhino a lot.

To prevent crashes it's also a good idea, for example, to disable the solver and then recompute manually once you have made the changes you want. A lot of the time GH freezes because you forgot to Graft or Flatten one of the connections, and then all the calculations get multiplied, as the sketch below illustrates.
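Here is a loose plain-Python analogy (a sketch only; GH's data trees don't literally work this way, but the arithmetic is the same):

    # Matching two lists item-by-item costs N operations; crossing every
    # item with every item costs N*N. A forgotten Graft/Flatten can turn
    # the first case into the second.
    a = list(range(1000))
    b = list(range(1000))

    matched = [x + y for x, y in zip(a, b)]    # 1,000 additions
    crossed = [x + y for x in a for y in b]    # 1,000,000 additions
    print(len(matched), len(crossed))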

Hi Armin,

Thank you for responding.

Well, as I said, I am rather new to ELK. My usage of it is limited to these tutorials:

http://blog.alexwebb.com/?p=1343  and https://www.youtube.com/watch?v=8_uQj_Rr8eA 

So I import it through 'File Path', then 'Location'. The problem is that the file has an excessive amount of data inside it, and I don't know how to limit/select that amount (nor could I find an understandable way, to be honest).

Concerning these questions: "Have you tried with a small subset of the same data, like just one block? Did it work? Did you use the Profiler to see which components are the most time-consuming?"
I'm not sure I understand you, but in case you're telling me to open the file in a text editor to see the data I need: I did try that, but it already took far too long to open, so finding the data I need was unbearable.

And yes, I did check the Task Manager, and it shows that I have no free RAM while processing it. There are about 700,000 values.
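(Incidentally, a count like that can be checked without opening the file in a text editor at all: Python can stream the XML and tally elements as it goes. A minimal sketch, assuming a standard OSM XML file; "paris.osm" stands in for the real path.)

    import xml.etree.ElementTree as ET
    from collections import Counter

    counts = Counter()
    # iterparse streams the file element by element instead of loading it all
    for event, elem in ET.iterparse("paris.osm", events=("end",)):
        if elem.tag in ("node", "way", "relation"):
            counts[elem.tag] += 1
        elem.clear()  # release each element's memory as soon as it is counted

    print(counts)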

Alright, looking at that link, it seems it works with a smaller subset of the data. What I meant by "just one block" is that you use the same map of Paris but don't zoom out so far, and just try to see if it works with a small section.

But tell me about the moment it freezes. Does it freeze even when loading the file? If you just have the component that reads the data and don't connect anything to it, can you successfully load the data and see that the output has 700,000 values inside? If that already works, then you can simply use the List Item component to get just a few of the values and then ramp it up, like this:

What I set up here is a Series with 1 million values, and then I take a small set of that using List Item with another Series. The data that goes into the List Item can be anything. Just the List Item alone takes 1.4 seconds, and the whole thing takes 3.2 seconds to calculate, which is quite long for something so simple. Rhino reports 980 MB of memory usage for this. If I crank it up to 10 million values, the calculation time is 28.6 seconds and memory goes up to 3.2 GB. So yes, I can believe that what you are trying to achieve will push the computer really hard.
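In plain-Python terms, that test looks roughly like this (a sketch of the idea, not the actual GH definition):

    # Build a big list, then pull out a small, evenly spaced subset to test
    # with before ramping up. Sizes mirror the numbers in this post.
    values = list(range(1_000_000))    # the "Series" with 1 million values
    step = len(values) // 1000         # keep roughly 1,000 samples
    subset = values[::step]            # the "List Item" fed by another Series
    print(len(subset), subset[:5])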

Can I ask what computer you have? How much memory does it have?

Doing things with an entire huge city like Paris is starting to be something you wouldn't do on a laptop or an old PC; that's why researchers and scientists have expensive workstations with large amounts of RAM. Can't you just use a smaller city, or just a part of Paris? Is there a reason it has to be ALL of Paris? In the end it's what you do with the data that has to be cool, not just the fact that you used A LOT of data. So rather do something really cool with a small set of data than something that's limited because there is so much data.

It freezes the moment I try creating a polyline for the 'buildings' feature key. It loads OK, I guess; it does take a little while after connecting it to the 'Location' component, but not too long. Anyway, I could get as far as seeing the number of elements within the file.

I have a Dell Inspiron 17R (5721) with 8 GB of RAM. It works all right with a small portion of the map, as in the region I would be working on, but I need the entire map for a research presentation I am doing. A vectorised map, with the possibility of extracting certain information, would save me a great amount of time (unless figuring it out would take even more time, of course).

Hm, OK, well then it will just take a long time to calculate and need a lot of RAM. Once your physical RAM runs out and it has to start swapping to the hard drive, things will get really, really slow.

Send me the GH file you are using, and I will download the map and try it out on my machine here. It's a top-of-the-range 27" iMac with 16 GB of RAM. If it works, I can send it back to you as a Rhino file. If you do send it, please add some notes on what goes where. I will download the map data from the link you gave.

Hm, actually OpenStreetMap keeps giving me an error when I try to download the data :/ Maybe you can put the data on Dropbox or WeTransfer or something.


OK, so I had a look at it. It seems ELK is just super slow for some reason. I downloaded a piece of software called OSM IQ, which can read and export OSM data, and it opened the file you sent me in a few seconds. I could then export the buildings, natural features and highways to .dxf and import them into Rhino.

I am attaching a link to the .3dm file, with buildings, highways and natural features on layers. From there it is easy to get them into GH and do what you need with them. That way you won't need ELK at all and can concentrate on actually doing something cool with the data :) (A rough sketch of what such a converter does internally follows the link below.)

https://www.dropbox.com/s/bybo4krodje6i0e/Paris_natural_highways_bu...
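For the curious, this is roughly what a converter like OSM IQ has to do under the hood. A hedged Python sketch, not OSM IQ's actual code: it assumes standard OSM XML, where nodes appear before the ways that reference them, and "paris.osm" is a placeholder path.

    import xml.etree.ElementTree as ET

    nodes = {}        # node id -> (lon, lat)
    buildings = []    # one coordinate loop per building footprint

    for event, elem in ET.iterparse("paris.osm", events=("end",)):
        if elem.tag == "node":
            nodes[elem.get("id")] = (float(elem.get("lon")), float(elem.get("lat")))
            elem.clear()
        elif elem.tag == "way":
            tags = {t.get("k"): t.get("v") for t in elem.findall("tag")}
            if "building" in tags:
                refs = [nd.get("ref") for nd in elem.findall("nd")]
                buildings.append([nodes[r] for r in refs if r in nodes])
            elem.clear()

    print(len(buildings), "building outlines")

Each loop of (lon, lat) pairs can then be turned into a polyline in Rhino or GH.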

It's funny that you are using a map of Paris. I went to the Informed Matter workshop in Paris a while ago, and for our last little project, called "lifeline", we used data about Paris, in this case the metro lines and population data. The image we produced shows population density along the metro lines.

Maybe they will hold the workshop again; it seems like something you should go to:

http://www.co-de-it.com/wordpress/informed-matter-gh-ws-paris.html

Thank you very much :) Although I would really like to know how to do it in ELK, there might be a problem with my PC, or maybe even with ELK itself :(

About that workshop: is it a short program, or is it for a master's degree? I looked through it and it appeared to be a 12-month master's program, so I'm a bit confused. (Being in my 2nd year of architecture school, I don't think I could get through it. Having more knowledge of computational design would help me a lot, but a 12-month program might leave me no time to focus on my school projects.)

Is it just me who got it wrong?

Hi again. No, it was not a master's program. They do also offer one now, and I would love to go, but I can't this year, so I will hopefully do it next year.

This was a 4-day workshop that was all about using data in Grasshopper. We learned a lot about data sources, environmental data, etc., and how to use them in GH. It was a great experience and was my real start with GH. I knew it a little beforehand, but I learned so many things about it there. I'm not sure if they will do it again, but it was quite successful, so maybe they will.

I have not used ELK before, so I can't comment on it, but I can confirm that it took a really long time. For the most part, even for a very large architectural project, you won't need OSM data for a whole city, especially not one as big as Paris. So if you want to use ELK, use it with normal-size maps of a few hundred metres square, because 99% of projects won't be bigger than that. If yours is, then you will have the proper resources ;)
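If you do need to cut a big OSM dump down to such a map yourself, something along these lines works. A rough sketch, assuming standard OSM XML: it keeps the nodes inside a bounding box plus any way that touches a kept node, and ignores relations. The coordinates are example values for central Paris, and the file names are placeholders.

    import xml.etree.ElementTree as ET

    MIN_LON, MIN_LAT, MAX_LON, MAX_LAT = 2.30, 48.85, 2.33, 48.87

    tree = ET.parse("paris.osm")   # loads the whole file into RAM, so this
    root = tree.getroot()          # suits medium files, not a 400 MB dump

    kept = set()
    for node in root.findall("node"):
        if (MIN_LON <= float(node.get("lon")) <= MAX_LON
                and MIN_LAT <= float(node.get("lat")) <= MAX_LAT):
            kept.add(node.get("id"))
        else:
            root.remove(node)

    for way in root.findall("way"):
        if not kept & {nd.get("ref") for nd in way.findall("nd")}:
            root.remove(way)

    tree.write("paris_subset.osm")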
