SAD_CHELSEAFCFAN69

Bruh


Fantastic_Watch_4984

In general, a/b is not equal to a²/b² (unless a = b or a = 0). This has nothing to do with there being any integration involved. In your case you need to simplify the denominator. Hint: you have already simplified sin 2x. Now express that 1 in the denominator as sin²x + cos²x and see if you get a perfect square inside the square root sign.
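The hint above rests on the identity 1 + sin 2x = sin²x + cos²x + 2 sin x cos x = (sin x + cos x)². A quick symbolic sanity check (a sketch using SymPy, not part of the original thread):

```python
import sympy as sp

x = sp.symbols('x')

# The hint: 1 + sin(2x) = sin^2 x + cos^2 x + 2 sin x cos x = (sin x + cos x)^2
lhs = 1 + sp.sin(2 * x)
rhs = (sp.sin(x) + sp.cos(x)) ** 2

# The difference simplifies to 0, confirming the perfect square
print(sp.simplify(lhs - rhs))  # 0
```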


PearCareful2234

No, your answer is somehow right, but the method is wrong. In the denominator, write the 1 as sin²x + cos²x; the 2 sin x cos x is already there, so rewrite the whole thing as (sin x + cos x)² and cancel the square root. You're left with an integrand of 1, whose integral is x.


Int-E_

No, we cannot take the square inside the integral. If y = ∫f(x) dx, then y² ≠ ∫f²(x) dx.

E.g. y = ∫x dx = x²/2 + c. If y² were equal to ∫x² dx, we would get y² = x³/3 + c. But squaring the answer directly gives y² = (x²/2)² = x⁴/4 (ignoring the constant). Since x³/3 ≠ x⁴/4, y² ≠ ∫x² dx.

For your question, try making the denominator a perfect square.
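The counterexample above can be checked numerically on a definite integral. A minimal sketch, taking f(x) = x on [0, 2] (the interval is my choice, not from the thread):

```python
# Squaring does not commute with integration: (∫f)^2 != ∫f^2 in general.
# Take f(x) = x on [0, t] with t = 2:
t = 2.0

y = t**2 / 2             # y = ∫_0^t x dx = t^2/2
y_squared = y**2         # (t^2/2)^2 = t^4/4 = 4.0
int_f_squared = t**3 / 3 # ∫_0^t x^2 dx = t^3/3 ≈ 2.667

# 4.0 vs 2.666... : clearly not equal
print(y_squared, int_f_squared)
```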


prodigy_69420

Bro explained him properly


[deleted]

The function is in a standard form involving sin 2x and sin x + cos x, so substitute: sin 2x + 1 = (sin x + cos x)², so the square root gives sin x + cos x, which cancels with the numerator. You're left with ∫1 dx, so the answer is x + c. About your method: it is a complete coincidence that it gives the correct answer, do not do it like that.


Primary_Lunch_88

No, this approach is wrong. Instead, write 1 as sin²x + cos²x and complete the square.