
About Vehicle Auto control #793

Open
Malathi15 opened this issue Jun 6, 2020 · 17 comments
Labels
answered, Waiting for response, enhancement (New feature or request)

Comments

@Malathi15

Malathi15 commented Jun 6, 2020

Hi,
I am trying to create autonomous vehicle software based on deep reinforcement learning, and I am using the LGSVL Simulator for research purposes.

System specification:
OS: Ubuntu 18.04
Memory: 15.7 GiB
Processor: AMD® Ryzen 7 3800X 8-core processor × 16

I downloaded the LGSVL Simulator from this GitHub page, https://github.com/lgsvl/simulator, and followed this documentation: https://www.lgsvlsimulator.com/docs/build-instructions/.
After building the LGSVL Simulator I need to run it in auto-control mode, but the simulator runs only in manual control mode. Please check the attached file (the JSON sensor configuration I used for the vehicle):
Jaguar2015XE (Autoware).txt

Can you please clarify the following doubts:

  1. How can manual controls be used in reinforcement learning training for an autonomous vehicle?

  2. Should the autonomous host vehicle's control be written as code?

  3. Do we need to write code to feed in sensor data first, in order to train the car to drive itself in autonomous driving mode?

Thanks in Advance
Malathi K

@Malathi15 changed the title from "Not able to use Vehicle Auto control" to "About Vehicle Auto control" Jun 6, 2020
@ntutangyun
Contributor

Hi @Malathi15, I'm no expert here.

But if you're building an end-to-end solution and want to use ML-Agents, you could refer to the code at Assets/Scripts/Api/Commands to see how to collect the necessary observations (sensor data) and how to apply actions (controls) to the ego vehicle.

For your doubts:

  1. If your solution is an end-to-end neural network that takes raw sensor data as input and outputs control signals, you probably won't need manual control during training. But if you wish to do imitation learning, then you will probably need to capture manual controls during training as well.

  2. Aren't you using deep networks to output control signals?

  3. Yes. You may check out the code in the API to see how to get sensor data internally (within Unity), or the PythonAPI to get sensor data externally.

Again, I'm no expert here; please correct me if I'm wrong. Thanks.
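For the external (PythonAPI) route, a minimal control loop could look like the sketch below. This assumes a simulator instance is already running on the default port; the scene name is a placeholder from the quickstart examples, and the vehicle name would have to match your installed asset:

```python
# Sketch of an external control loop over the LGSVL PythonAPI.
# Assumes the simulator is running and the lgsvl package is installed;
# scene and vehicle names are placeholders -- adjust them to your setup.

def clamp(x, lo=-1.0, hi=1.0):
    """Keep throttle/steering commands in the simulator's expected range."""
    return max(lo, min(hi, x))

def drive(steps=100):
    import lgsvl  # imported here so the helper above is usable without it

    sim = lgsvl.Simulator(address="127.0.0.1", port=8181)
    sim.load("BorregasAve")                  # placeholder scene name

    state = lgsvl.AgentState()
    state.transform = sim.get_spawn()[0]
    ego = sim.add_agent("Jaguar2015XE", lgsvl.AgentType.EGO, state)  # placeholder name

    for _ in range(steps):
        control = lgsvl.VehicleControl()
        control.throttle = clamp(0.3)        # a trained policy's output would go here
        control.steering = clamp(0.0)
        ego.apply_control(control, True)     # True = sticky control
        sim.run(0.1)                         # advance the simulation by 0.1 s
```

Calling drive() with the simulator running should move the ego forward under scripted throttle; swapping the constants for a network's outputs is where RL would plug in.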

@Malathi15
Author

@ntutangyun Thank you so much for your reply.
I am new to LGSVL Simulator.
Can you please tell me how to change from manual control to auto control?

Thanks
Malathi K

@ntutangyun
Contributor

@Malathi15 Could you be more specific about what type of auto control you're looking for?

Do you mean that the ego vehicle drives by itself?

@Malathi15
Author

Malathi15 commented Jun 6, 2020

@ntutangyun
I need the host vehicle to drive by itself (based on the conditions we have). Is that possible?

Thanks
Malathi K

@ntutangyun
Contributor

@Malathi15 I don't understand your problem here. If the host vehicle could drive by itself, then what do you use reinforcement learning for, and how do you plan to train the network?

@Malathi15
Author

@ntutangyun
In ML-Agents, we need to change the brain type to VehicleBrain so that the host vehicle runs automatically during training.
Does LGSVL also have that type of control change for training? (I am new to the LGSVL simulator environment.)

Thanks
Malathi K

@ntutangyun
Contributor

@Malathi15 I'm not aware of such a control change in LGSVL.

Maybe the team could help out @EricBoiseLGSVL

@guruvishnuvardan

guruvishnuvardan commented Jun 6, 2020

Hi @ntutangyun,

I am Guru from @Malathi15's team.

Answering the following question:

aren't you using deep networks to output control signals?

Answer:

Yes, we are using RL. Based on observations (sensor data), the vehicle needs to pick autonomous actions (controls) such as accelerate, decelerate, turn right/left, reverse, and brake during RL training, and the training will determine whether an action is right or not based on reward points.

Can you please let us know if we can perform such things while training, since we are not doing imitation learning and we expect the system to learn automatically through RL.

Thanks
Guru
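That discrete action set could be mapped onto throttle/steering/braking/reverse fields along these lines; the field names mirror the PythonAPI's VehicleControl, but the numeric values are purely illustrative:

```python
# Illustrative mapping from a discrete RL action index to vehicle-control
# fields (throttle and steering in [-1, 1], braking in [0, 1], reverse flag).
ACTIONS = {
    0: dict(throttle=0.5, steering=0.0,  braking=0.0, reverse=False),  # accelerate
    1: dict(throttle=0.0, steering=0.0,  braking=0.3, reverse=False),  # decelerate
    2: dict(throttle=0.3, steering=0.5,  braking=0.0, reverse=False),  # turn right
    3: dict(throttle=0.3, steering=-0.5, braking=0.0, reverse=False),  # turn left
    4: dict(throttle=0.3, steering=0.0,  braking=0.0, reverse=True),   # reverse
    5: dict(throttle=0.0, steering=0.0,  braking=1.0, reverse=False),  # brake
}

def to_control(action: int) -> dict:
    """Translate a policy's discrete action index into control fields."""
    try:
        return ACTIONS[action]
    except KeyError:
        raise ValueError(f"unknown action {action}") from None
```

The reward would then be computed from the resulting observations, exactly as described: apply the action, step the simulator, score the outcome.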

@ntutangyun
Contributor

Hi @guruvishnuvardan, I haven't tried ML-Agents on LGSVL yet, but it should be possible. You may check out the scripts under Assets/Scripts/Api/Commands to see how to get the sensor data you need and how to apply the RL output to control the ego vehicle.

@deepakramani

I'm also not an expert, but I will try to answer some of your questions based on my experience.
Since you want the vehicle to drive automatically, you would need to use the PythonAPI. You program how you want it to drive; there are examples available to get you started. You then need to add the sensors you need.

For RL there is a doc. It may be a bit outdated, but it should be enough to get you started: https://www.lgsvlsimulator.com/docs/openai-gym/
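The linked doc puts the simulator behind the standard Gym reset/step interface. The skeleton below shows only that shape, with toy 1-D kinematics standing in for real simulator calls; see the doc for the actual gym_lgsvl environment:

```python
# Minimal Gym-style environment skeleton. The dynamics are a stub
# (1-D position/velocity) standing in for calls into the simulator;
# the reset()/step() contract is what an RL library expects.
class StubLgsvlEnv:
    MAX_STEPS = 200

    def reset(self):
        self.pos, self.vel, self.t = 0.0, 0.0, 0
        return self._obs()

    def step(self, throttle):
        # A real wrapper would apply a VehicleControl and advance
        # the simulator here; this is toy kinematics.
        self.vel += 0.1 * throttle
        self.pos += self.vel
        self.t += 1
        reward = self.vel                  # e.g. reward forward progress
        done = self.t >= self.MAX_STEPS
        return self._obs(), reward, done, {}

    def _obs(self):
        return (self.pos, self.vel)

env = StubLgsvlEnv()
obs = env.reset()
obs, reward, done, info = env.step(1.0)    # one training step
```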

@rongguodong

rongguodong commented Jun 6, 2020

@ntutangyun
I need the host vehicle to drive by itself (based on the conditions we have). Is that possible?

Thanks
Malathi K

I think you need an AV stack (e.g. Apollo, Autoware, or your own algorithms) to control the ego car. The simulator just simulates the environment, sensors, cars, etc.; there is no AV algorithm in the simulator.
You can treat the simulator as giving you a virtual environment and a virtual car equipped with lots of sensors, but you need to provide the AV software yourself (or manually drive the car).

@EricBoiseLGSVL
Contributor

@Malathi15 @guruvishnuvardan You could extend the VehicleSMI class to follow the HD map. You would need to look at NPCController.cs and the FollowLanes code that NPCs use to follow lanes. This could be adapted into a controller for the EGO that would use Python API data for control and decisions. As @rongguodong and @dr563105 have already stated, this is possible but will require a good amount of work, because the simulator is designed to be used with an AV stack. Training is important, so please let us know how we can help or if you have questions.
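For what it's worth, the lane following that NPCs do is essentially waypoint tracking along the HD map. If adapted for the ego, the steering half of such a controller might look like this pure-pursuit-style sketch (illustrative geometry only, not the actual NPCController.cs logic; the wheelbase value is a guess):

```python
import math

def steering_to_waypoint(x, y, heading, wx, wy, wheelbase=2.9):
    """Steering angle (radians) aiming a vehicle at the next lane waypoint.

    (x, y, heading): current pose, heading in radians.
    (wx, wy): next waypoint sampled from the HD-map lane.
    """
    dx, dy = wx - x, wy - y
    alpha = math.atan2(dy, dx) - heading   # bearing to waypoint, vehicle frame
    ld = math.hypot(dx, dy)                # lookahead distance
    if ld < 1e-6:
        return 0.0                         # already at the waypoint
    # Classic pure-pursuit curvature converted to a steering angle.
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)
```

A waypoint dead ahead yields zero steering, and waypoints to the left or right yield positive or negative angles, which would then be normalized into the simulator's [-1, 1] steering range.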

@EricBoiseLGSVL added the answered, Waiting for response, and enhancement labels Jun 8, 2020
@guruvishnuvardan

Thanks @EricBoiseLGSVL, @rongguodong , @dr563105 , @ntutangyun.

We will work on the suggestions and get back to you, as we are still in the process of understanding the LGSVL Simulator, vehicles, sensors, OpenAI Gym, and RL.

Regards
Guru

@ehsan-ami

I desperately need an auto-control module for data collection purposes. This is helpful when you are working on ML-based vehicle detection, tracking, and prediction. Using random seeds, and giving the ego vehicle NPC-like behaviors such as follow_closest_lane and obeying the traffic rules, I could automate the data-collection process, which is highly labor-intensive without this module.

I would appreciate any other comments, or suggestions of other possible approaches.
Anyway, I believe that a simple auto-control module would be beneficial to many users for data collection and machine learning purposes, and there is excellent motivation to add this to LGSVL.

Best regards,
Ehsan

@EricBoiseLGSVL
Contributor

@ehsan-ami This issue describes multiple ways to automatically control the ego vehicle (PythonAPI, an AD stack, or editing VehicleSMI to take code from the NPCControl logic). What issue are you having implementing these solutions?

@ehsan-ami

@EricBoiseLGSVL I am looking for automatic control of the ego vehicle for the purpose of data collection. Using random placement of NPC objects in the scene, randomization of the weather conditions, and the NPCControl logic, you could create various traffic scenarios and collect many data snippets (e.g., 200 snippets of 20 seconds each at an intersection) in the same environment, if the ego vehicle could behave like an NPC vehicle.

I couldn't find a way to do this using the Python API. On the other hand, using an AD stack is not well suited to running lots of short scenarios. The only reasonable option I could think of is editing the simulator's source code. Since I am not familiar with the code structure of LGSVL, I would appreciate a general sense of the task and some hints.

Besides, since this feature is needed by users who want to collect their own datasets using LGSVL (and I think there must be a reasonable number of such users), my suggestion and request is that the NPCControl logic be added to the ego vehicle in future releases.

Many thanks,
Ehsan
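Much of the randomization described above is already reachable from the PythonAPI: NPCs support follow_closest_lane and the weather is settable per run. A sketch, assuming a running simulator; the NPC model names and spawn handling are placeholders, and the missing piece remains the ego behaving like an NPC, as discussed:

```python
# Sketch: randomized short scenarios for data collection via the PythonAPI.
import random

NPC_NAMES = ["Sedan", "SUV", "Hatchback"]      # placeholder NPC model names

def sample_params(seed):
    """Deterministic per-snippet randomization so runs are reproducible."""
    rng = random.Random(seed)
    return {
        "rain": rng.random(),
        "n_npcs": rng.randint(3, 8),
        "speeds": [rng.uniform(3.0, 8.0) for _ in range(8)],
    }

def run_snippet(sim, lgsvl, seed, duration=20.0):
    """Run one randomized scenario for `duration` seconds of sim time."""
    p = sample_params(seed)
    sim.reset()
    sim.weather = lgsvl.WeatherState(rain=p["rain"], fog=0.0, wetness=0.0)

    spawns = sim.get_spawn()
    for i in range(p["n_npcs"]):
        state = lgsvl.AgentState()
        state.transform = random.choice(spawns)          # crude placement
        npc = sim.add_agent(random.choice(NPC_NAMES),
                            lgsvl.AgentType.NPC, state)
        npc.follow_closest_lane(True, p["speeds"][i])    # NPC drives itself
    sim.run(duration)                                    # record sensors here
```

Seeding each snippet makes individual scenarios reproducible, which matters when you want to re-collect a data snippet after changing the sensor setup.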

@EricBoiseLGSVL
Contributor

The PythonAPI sets the path for the EGO, and will not work if you want logic for pathing.
An AD stack would be difficult for many short runs; you are correct.
You can alter VehicleControl/VehicleSMI to use the code the NPCs use in LaneFollowBehaviour. This would allow the ego to follow lanes the same way NPCs do.
We agree that this would be a good feature, but since the simulator is focused on evaluating AD stacks at the moment, we have not been able to implement it. Hopefully we can add it to the roadmap in the near future.
