Other forms: instituted; institutes; instituting
An institute is an organization or association designed to study or promote something. If you're interested in politics, you might want to do an internship at one of Washington D.C.'s many political research institutes.
While you may have heard of an institute, whether it's the National Institute for Art Advancement or the National Cancer Institute, you may not know institute in its verb form. To institute something means to establish it or set it in motion. You might institute the hiring of Spanish speakers at your company, or, if workers complain about being overworked, you might institute a new policy on taking breaks.