Answers

2016-01-29T13:40:31+05:30
Yes, it builds self-confidence and helps them earn self-respect in society. By taking up jobs after their studies, they can stand on their own feet.
2016-01-29T14:29:21+05:30
No, I don't think so. Girls should get an education, but if a girl takes a job, it will become difficult for her to manage both the household and the job, and it will be a big burden for her. It is the duty of the man to take a job. Still, if women wish to work, they can.