I am new to Spark, so I want to know: how can I access an in-memory table from another Spark Scala shell session?
Thanks in advance!
That's not currently supported: an in-memory (temp) table is tied to the SQLContext of the shell session that registered it, so a second shell session cannot see it. If you want to share RDDs between jobs, take a look at either IBM's Spark Kernel project or the Ooyala Spark Job Server, both of which allow sharing a single SparkContext between multiple applications.
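For instance, with the Spark Job Server one job can publish an RDD under a name and a later job submitted to the same context can look it up instead of recomputing it. Here is a minimal sketch, assuming the classic spark-jobserver job API (the `SparkJob` trait mixed with `NamedRddSupport`); the name `"shared-rdd"` and the object names are just illustrative:

```scala
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{NamedRddSupport, SparkJob, SparkJobValid, SparkJobValidation}

// First job: creates an RDD and publishes it under a name so that
// later jobs running on the same context can reach it.
object ProducerJob extends SparkJob with NamedRddSupport {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid

  override def runJob(sc: SparkContext, config: Config): Any = {
    val rdd = sc.parallelize(1 to 100)
    namedRdds.update("shared-rdd", rdd) // cache and register under a name
    rdd.count()
  }
}

// Second job, submitted later against the same context: looks the
// RDD up by name instead of rebuilding it from scratch.
object ConsumerJob extends SparkJob with NamedRddSupport {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid

  override def runJob(sc: SparkContext, config: Config): Any = {
    val shared = namedRdds.get[Int]("shared-rdd")
      .getOrElse(sys.error("shared-rdd not found; run ProducerJob first"))
    shared.sum()
  }
}
```

Note that both jobs have to be submitted to the same long-running context (created up front through the job server's REST API, with something like `curl -X POST 'localhost:8090/contexts/shared-ctx'`); named RDDs live only as long as that context does.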